I have some questions on how to proceed at the bottom of this post.
In the host simulation environment, one cannot interact with the running simulation using the standard Morphic "event chain".
One cannot do so because from the host environment, there is no World in the simulation; there is only a painted picture of a World. The key to seeing this is to grok what a running simulation actually is.
StackInterpreterSimulator >> run
"..simulation setup code.."
[true] whileTrue:
	[self assertValidExecutionPointers.
	 atEachStepBlock value. "N.B. may be nil"
	 self dispatchOn: currentBytecode in: BytecodeTable.
	 self incrementByteCount].
"..we never get here..."
There is no 'hook' for me to chain the Morphic events down into; there are only bytecodes being fetched and executed (which is totally cool and awesome, btw).
Those bytecodes paint a pretty picture on an ImageMorph placed on a Frame within a SystemWindow, as seen here in
StackInterpreterSimulator >> openAsMorph
"...."
window addMorph: (displayView := ImageMorph new image: displayForm) frame: (0@0 corner: 1@0.8).
"...."
That image is not a PasteUpMorph named TheWorld that can respond to events--it is just a pretty picture.
That, ladies and gentlemen, is the point where we fat and coddled coder Hobbits have to enter the dragon's cave, leaving the comfortable environment of the Squeak API for whatever strange and dangerous world lies below. Me? I decided to post here instead before venturing in (:
So, having learned this the hard way, there are a couple of strategies I could take--RFB, or maybe Nebraska--to get events over to the target image, but before doing that I have to ask: is it really necessary?
My goal is to port the StackInterpreter to native 64 (and after that 64x64). Presumably Eliot managed to do the original work without direct interaction with Morphic on the running world, so shouldn't my task now be to emulate whatever techniques Eliot uses?
If so, that raises the final point. What exactly are those techniques? Specifically, given a running simulation, how does one produce a new VM? Do we boot up the target image on which we have developed and run VMMaker the standard way?
Finally, if direct interaction with the running simulation world is desirable, I will be happy to implement it. Just inform me of what strategy you would prefer and I will try to get it done.
Thank you for your time.
Hi Tty,
On Wed, Jan 22, 2014 at 5:23 AM, gettimothy gettimothy@zoho.com wrote:
I have some questions on how to proceed at the bottom of this post.
In the host simulation environment, one cannot interact with the running simulation using the standard Morphic "event chain".
One cannot do so because from the host environment, there is no World in the simulation; there is only a painted picture of a World. The key to seeing this is to grok what a running simulation actually is.
StackInterpreterSimulator >> run
"..simulation setup code.." [true] whileTrue: [self assertValidExecutionPointers. atEachStepBlock value. "N.B. may be nil" self dispatchOn: currentBytecode in: BytecodeTable. self incrementByteCount]. "..we never get here..."
There is no 'hook' for me to chain the Morphic events down into; there are only bytecodes being fetched and executed (which is totally cool and awesome, btw).
Those bytecodes paint a pretty picture on an ImageMorph placed on a Frame within a SystemWindow, as seen here in
StackInterpreterSimulator >> openAsMorph
"...." window addMorph: (displayView := ImageMorph new image: displayForm) frame: (0@0 corner: 1@0.8). "...."
That image is not a PasteUpMorph named TheWorld that can respond to events--it is just a pretty picture.
Let me expand on this, because what you say is not the whole picture ;-)
The Morphic event you cannot (yet) pass to the simulation came from somewhere. It actually bubbled up from within the real VM you're running. It started off as an OS event coming in through EventSensor>>primGetNextEvent: which is sent from fetchMoreEvents. From there EventSensor>>processEvent: created the Morphic event that you're having problems with.
Inside the real VM that OS event was responded to by the VM first calling ioProcessEvents from its event polling routine StackInterpreter>>ioProcessEvents, and then fetching the OS event via InterpreterPrimitives>>primitiveGetNextEvent, which implements EventSensor>>primGetNextEvent:. Inside primitiveGetNextEvent there's a call to ioGetNextEvent: which will answer the OS event that primitiveGetNextEvent answers as an Array.
Addressing the simulation, what is painting the picture is the VM simulator that sits behind the window. Within this VM there is a simulation of both ioProcessEvents and ioGetNextEvent:, but they do nothing:
StackInterpreterSimulator>>ioProcessEvents
	"do nothing..."

StackInterpreterSimulator>>ioGetNextEvent: evtBuf
	self primitiveFail.
So your challenge is to take the Morphic event, queue it inside StackInterpreterSimulator (e.g. in a new inst var eventQueue), and convert it to an OS event (an Array), so that it can for example implement ioGetNextEvent: as something like
StackInterpreterSimulator>>ioGetNextEvent: evtBuf
	eventQueue isEmpty ifTrue: [^self primitiveFail].
	self convertMorphicEvent: eventQueue removeFirst into: evtBuf
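For mouse events, a sketch of what convertMorphicEvent:into: might look like. The eight-slot buffer layout (type, timestamp, x, y, buttons, modifiers, reserved, window index) follows the "Event Types" comment in sq.h, but the exact slot order, the zero-based indexing through the simulator's CArrayAccessor, and the need to translate Morphic button/modifier encodings are all assumptions to verify against sq.h before trusting this:

```smalltalk
StackInterpreterSimulator>>convertMorphicEvent: aMorphicEvent into: evtBuf
	"Hypothetical sketch: fill the OS event buffer from a Morphic MouseEvent.
	 Slot layout assumed from the 'Event Types' comment in sq.h;
	 evtBuf assumed zero-based (CArrayAccessor)."
	aMorphicEvent isMouse ifTrue:
		[evtBuf at: 0 put: 1. "EventTypeMouse"
		 evtBuf at: 1 put: aMorphicEvent timeStamp.
		 evtBuf at: 2 put: aMorphicEvent position x.
		 evtBuf at: 3 put: aMorphicEvent position y.
		 evtBuf at: 4 put: aMorphicEvent buttons. "encoding may need translation"
		 evtBuf at: 5 put: 0. "modifiers; may need extraction from buttons"
		 evtBuf at: 6 put: 0. "reserved"
		 evtBuf at: 7 put: 0  "host window index"]
```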
HTH
Eliot,
Thank you for the pointer in the right direction.
I will revisit that. That looks like fun.
My first foray into primitives....hmmmm
cordially,
tty.
On 22 Jan 2014, at 7:05 , Eliot Miranda eliot.miranda@gmail.com wrote:
So your challenge is to take the Morphic event, queue it inside StackInterpreterSimulator (e.g. in a new inst var eventQueue), and convert it to an OS event (an Array), so that it can for example implement ioGetNextEvent: as something like
StackInterpreterSimulator>>ioGetNextEvent: evtBuf
eventQueue isEmpty ifTrue: [^self primitiveFail].
self convertMorphicEvent: eventQueue removeFirst into: evtBuf
As for the conversion into the OS array, searching for "Event Types" in sq.h will probably give a better introduction and a faster understanding of the structure of said array than delving into the code doing array -> Morphic event conversion in an image. Make sure you get WindowEvent type 6 correct! :)
Cheers, Henry
I don't think there is any reason to convert to an OS array--HandMorph>>processEvents already knows the OS event array that was used to generate the Squeak event. Making that accessible may be all you need from the front end.
Cheers, Bob
On 23 Jan 2014, at 12:32 , Bob Arning arning315@comcast.net wrote:
I don't think there is any reason to convert to an OS array - HandMorph>>processEvents already knows the OS event array that was used to generate the squeak event. Making that accessible may be all you need from the front end.
Cheers, Bob
There are multiple ways to Rome, each with its own advantages. What you describe would be simple, but would probably require overriding the MorphicEvent definition/HandMorph code, though. (Plus doing coordinate translation of the event buffer in the SimulatorMorph.)
In a Pharo VMMaker image, you could bypass Morphic event handling entirely, without touching any core system code, by registering a custom InputEventHandler with the InputEventFetcher, which would do a short inside/active test for mouse/keyboard events respectively, and add the array to the SimulatorMorph's event queue (with coordinate translation as in the above approach).
Translating back from Morphic events is more work, true, but can be done with standalone code, and would work in both Pharo and Squeak.
Cheers, Henry
Hi Eliot, Bob and Henry.
Changeset and pic attached. (If the changeset has problems, let me know; I copied from my live CS to a clean copy to share, omitting my fiddling-around changes. I might have missed something.)
I would appreciate your advice on how to proceed from here.
The approach is modeled after what I saw in HostWindowProxy. I have other things on my checklist to try, but this seemed the easiest approach, so I ran with it. My first goal was to just get the darned events into the simulator, see something on the simulated transcript and a menu to show up.
Modifications were as follows (changeset attached, btw)
EventSensor
- class side: added an accessor for the variable forwardVMSimulationEvents.
- instance side: added the instance var forwardVMSimulationEvents.
- modified processEvent: evt to forward raw events if the flag is set.
StackInterpreterSimulator
- class side: added infrastructure to make it a Singleton, and created a class-side accessor to invoke the instance-side method.
- instance side: added an eventQueue (SharedQueue) instance variable with lazy initialization; queueForwardedEvent takes an event from EventSensor and dumps it on the eventQueue; ioGetNextEvent checks the eventQueue and copies data from the forwarded event into the CArrayAccessor.
- There is commented-out code I used to verify that 1. the argument evtBuf is a bunch of zeroes coming in, and 2. evtBuf matched the forwarded event after the copy.
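For concreteness, the two instance-side methods described above might look roughly like this. This is a sketch reconstructed from the description, not the actual changeset; SharedQueue and its nextPut:/nextOrNil are standard Squeak, but the zero-based copy into evtBuf is an assumption about CArrayAccessor indexing:

```smalltalk
StackInterpreterSimulator>>queueForwardedEvent: anEventArray
	"Buffer a raw event array forwarded from EventSensor (lazy initialization)."
	eventQueue ifNil: [eventQueue := SharedQueue new].
	eventQueue nextPut: anEventArray

StackInterpreterSimulator>>ioGetNextEvent: evtBuf
	"Copy the next buffered event, if any, into the C event buffer."
	| evt |
	(eventQueue isNil or: [eventQueue isEmpty]) ifTrue: [^self primitiveFail].
	evt := eventQueue nextOrNil.
	evt ifNil: [^self primitiveFail].
	1 to: evt size do:
		[:i | evtBuf at: i - 1 put: (evt at: i)] "assuming zero-based CArrayAccessor"
```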
Workspace code to launch the simulation:
| vm |
Transcript clear.
EventSensor forwardVMSimulationEvents: true.
vm := StackInterpreterSimulator newWithOptions: #(#STACKVM).
vm openOn: '/home/wm/usr/src/smalltalk/buildCogDevelopmentImageCog.app/Contents/Resources/targets/Squeak4.5.image'.
vm openAsMorph; run
"EventSensor forwardVMSimulationEvents: false."
Observations:
- The events get there and the WorldMenu pops up. woot. (lower-case "woot", no exclamation point; not to be confused with WOOT!!!!)
- The tildes "~" in the simulation Transcript (in the attached pic) are generated at StackInterpreterSimulator>>ioGetNextEvent.
- Event processing is dog slow. Be prepared to wait 5 minutes for the menu to pop up.
- Tightly coupled. I don't like the modification to EventSensor.
- I don't like having to implement a Singleton--but it was saner than Smalltalk at: #StackInterpreterSimulatorLSB allInstances do: [...].
- I am unfamiliar with Smalltalk Semaphores; had I known how to do the Smalltalk equivalent of an event listener, I would have gone that route. Consumer/producer Semaphore examples from the various books I have read are all within the same process, so I cut my losses and went with the current approach.
EventSensor>>processEvent: does have the events in raw form; no need to translate before forwarding to the StackInterpreter.
In principle, we know we CAN get the dang events into the StackInterpreter.
How do I make this responsive and useful?
Thank you for your time
p.s. this is fun.
cheers.
tty.
Given that it is a simulation popping up the menu, how fast is it reasonable to expect? What is your normal simulated bytecodes/sec? How many bytecodes executed in that 5 minutes? Perhaps more importantly, how many simulated screen redraws in that interval? How expensive are these?
Cheers, Bob
On 1/30/14 1:50 PM, gettimothy wrote:
Event processing is dog slow. Be prepared to wait 5 minutes for
the menu to pop up. ...
How do I make this responsive and useful?
On Thu, Jan 30, 2014 at 11:20 AM, Bob Arning arning315@comcast.net wrote:
Given that it is a simulation popping up the menu, how fast is it reasonable to expect? What is your normal simulated bytecodes/sec? How many bytecodes executed in that 5 minutes? Perhaps more importantly, how many simulated screen redraws in that interval? How expensive are these?
Oh, good point! I think the simulator restricts redraws to every few thousand bytecodes. It would be much more responsive if somehow the simulator could update the screen whenever the bitblt primitive affects the screen.
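One way Eliot's suggestion might be wired up: the simulator's displayView and displayForm come from openAsMorph quoted earlier; whether the simulator's ioForceDisplayUpdate stub is the right place to hang this, and whether it can reach those variables, are assumptions:

```smalltalk
StackInterpreterSimulator>>ioForceDisplayUpdate
	"Sketch: instead of waiting for the periodic redraw every few thousand
	 bytecodes, push the current displayForm to the morph immediately."
	displayView ifNotNil:
		[displayView image: displayForm.
		 displayView changed]
```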
There is a flush primitive which should be used.
http://isqueak.org/ioForceDisplayUpdate
Normally that is called by Morphic. In cases of bad code design, the OS X and iOS versions schedule a flush, if need be, every n milliseconds. I think the Windows version just respects the flush. The X11 flavors just flush on each draw, which makes it slower.
Sent from my iPhone
Hi Tty!
On Thu, Jan 30, 2014 at 10:50 AM, gettimothy gettimothy@zoho.com wrote:
Hi Eliot, Bob and Henry.
Changeset and pic attached. (If the changeset has problems, let me know, I copied from my live CS to a clean copy to share that omits my fiddling around changes. I might have missed something)
congratulations, and *thank you*!
I would appreciate your advice on how to proceed from here.
see below.
The approach is modeled after what I saw in HostWindowProxy. I have other things on my checklist to try, but this seemed the easiest approach, so I ran with it. My first goal was to just get the darned events into the simulator, see something on the simulated transcript and a menu to show up.
Modifications were as follows (changeset attached, btw)
EventSensor
- class side: added an accessor for the variable forwardVMSimulationEvents.
- instance side: added the instance var forwardVMSimulationEvents.
- modified processEvent: evt to forward raw events if the flag is set.
StackInterpreterSimulator
- class side: added infrastructure to make it a Singleton, and created a class-side accessor to invoke the instance-side method.
- instance side: added an eventQueue (SharedQueue) instance variable with lazy initialization; queueForwardedEvent takes an event from EventSensor and dumps it on the eventQueue; ioGetNextEvent checks the eventQueue and copies data from the forwarded event into the CArrayAccessor.
- There is commented-out code I used to verify that 1. the argument evtBuf is a bunch of zeroes coming in, and 2. evtBuf matched the forwarded event after the copy.
Workspace code to launch the simulation:
| vm |
Transcript clear.
EventSensor forwardVMSimulationEvents: true.
vm := StackInterpreterSimulator newWithOptions: #(#STACKVM).
vm openOn: '/home/wm/usr/src/smalltalk/buildCogDevelopmentImageCog.app/Contents/Resources/targets/Squeak4.5.image'.
vm openAsMorph; run
"EventSensor forwardVMSimulationEvents: false."
Observations:
The events get there and the WorldMenu pops up. woot. (lower case
"woot", no exclamation point. not to be confused with WOOT!!!!) The tildes "~" in the simulation Transcript (in the attached pic) are generated at StackInterpreterSimulator>> ioGetNextEvent Event processing is dog slow. Be prepared to wait 5 minutes for the menu to pop up.
That's just a fact of life. The simulator is dog slow :-).
Tightly coupled. I don't like the modification to EventSensor.
Agreed. To proceed, the next step would be to try and eliminate this modification. Isn't there a way of making the change self-contained by confining the set-up of the plumbing to openAsMorph?
I don't like having to implement a Singleton--but it was saner
than Smalltalk at: #StackInterpreterSimulatorLSB allInstances do:[...] I am unfamiliar with Smalltalk Semaphores, had I known how to do the Smalltalk equivalent of event listener I would have gone that route. Consumer/Producer Semaphore examples from various books I have read are all within the same process, so I cut my losses and went with the current approach.
EventSensor>>processEvent:evt does have the events in raw form, no
need to translate before forwarding to the StackInterpreter.
Right.
In principle, we know we CAN get the dang events into the StackInterpreter.
Which is fab. Again, thanks!
How do I make this responsive and useful?
Three things:
1. Figure out how to do it without modifying base packages, confining changes to VMMaker.
2. Ask David Lewis for write permission to the VMMaker repository.
3. Commit your changed version of VMMaker, from which I can merge :-)
Thank you for your time
Au contraire ;-)
p.s. this is fun.
+1
cheers.
tty.
On 30.01.2014, at 20:22, Eliot Miranda eliot.miranda@gmail.com wrote:
Tightly coupled. I don't like the modification to EventSensor.
Agreed. To proceed, the next step would be to try and eliminate this modification. Isn't there a way of making the change self-contained by confining the set-up of the plumbing to openAsMorph?
The right way would be to just use the regular morphic event processing and generate VM events from those. I don't see why you would need to hook into EventSensor.
- Bert -
Thank you all for your input. Very useful and appreciated.
Below are your recommendations/comments (in bold) as I have summarized them. My notes to myself on your recommendations/comments are italicized.
FWIW, my plan is to attempt the Semaphore route per Henrik first, followed by the optimizations suggested by John, Bob and Eliot. If I hit the wall on that, I'll try the Morphic approach per Bert's recommendation, followed by the optimizations.
Check out Pharo InputEventFetcher (Henrik): "There's already such a Semaphore operating across the VM/image divide, set by primitive 93. You can see an example of how to do an event listener using that in a Pharo image's InputEventFetcher ;)" -- Awesome. Thank you so much.
Filtering out MouseOver events would cut down on the response time? (Henrik) -- Do this first, as it is easy to implement in EventSensor.
Event translation needed? (Henrik) -- I don't know. My hunch is that it will need to be translated. Translation should be straightforward when/if I decide to implement it.
Use regular Morphic event processing (Bert) -- OK, but aren't Morphic events constrained to flow along the Morphic component tree (contrasted with the AXAnnouncements framework, where one can "Announcer announce:" an event)? If so, that would imply subclassing the ImageMorph that the Simulator paints the world on, where I can implement the forwarding/filtering/translation functionality. Bear in mind, I am a rookie at all but Seaside dev (and I am rusty at that (: ). If Morphic events are not constrained to flow along the Morphic tree, where is an example of a "Morphic event broadcast" I can forward to the Simulator?
Flush primitive: investigate http://isqueak.org/ioForceDisplayUpdate (John M.) -- "Normally that is called by Morphic. In cases of bad code design the OS X and iOS versions schedule a flush if need be every n milliseconds" (curious what triggers that "need be"). "The X11 flavors just flush on each draw, which makes it slower." Very informative. Maybe I could toggle the flush rate on Morphic "focus" or mouseIn/mouseOut events. If I go the non-Morphic route, how to do this... hmmm.
It would be much more responsive if somehow the simulator could update the screen whenever the bitblt primitive affects the screen. (Eliot) -- John M. provides the roadmap for this above.
Ditch the change to EventSensor (Eliot) -- make the change self-contained by confining the set-up of the plumbing to openAsMorph? Yes. The only "Morphic" avenue I see is subclassing ImageMorph. If it's Henrik's listener approach, then definitely yes. (Note to self: consider separation of concerns after it works.)
Given that it is a simulation popping up the menu, how fast is it reasonable to expect? (Bob) -- As fast as other simulated output seems a desirable goal.
What is your normal simulated bytecodes/sec? -- I don't know how to measure this. Is there a Smalltalk equivalent of the Linux "time" command?
How many bytecodes executed in that 5 minutes? -- Nor do I know how to measure this; however, filtering mouse-move events should help a bit.
How many simulated screen redraws in that interval? -- Maybe here is an idea: stop screen redraws on loss of focus or by direct command, or by setting a rate (I like this idea) on Simulator launch or via the command pop-up window (this would be cool).
How expensive are these? -- I don't know how to measure this.
Three things (Eliot): 1. figure out how to do it without modifying base packages, confining changes to VMMaker; 2. ask David Lewis for write permission to the VMMaker repository; 3. commit your changed version of VMMaker, from which I can merge :-) -- I will do 2 and 3 when we all agree the approach is sane.
Finally, I am approaching this with two pre-conceptions that may be wrong. (My Smalltalk work has been primarily Seaside, so my non-Seaside repertoire is limited)
1. Morphic events only traverse the Morphic tree and are not "broadcast" for "listeners" to pick up. This will necessitate subclassing the ImageMorph the Simulator uses.
2. Semaphores are a 1-to-1 pipe and not a 1-to-many pipe--hence I cannot just tap into an existing Semaphore. (Henrik's statement contradicts me here. I will investigate.)
Thanks again for your time.
Cordially,
tty
See HandMorph>>addEventListener:
On 1/31/14 12:05 PM, gettimothy wrote:
- Morphic events only traverse the Morphic tree and are not
"broadcast" for "listeners" to pick up. This will necessitate subclassing the ImageMorph the Simulator uses.
The flush or display sync primitive is driven by Morphic, so it bundles a set of Morphic draw commands. I think the box you get when you resize a window frame doesn't do the sync. You could nil out the prim call and observe the behavior on a Mac or iOS.
On Jan 31, 2014, at 12:05 PM, gettimothy gettimothy@zoho.com wrote:
curious what triggers that "need be"
On 31.01.2014, at 18:05, gettimothy gettimothy@zoho.com wrote:
Use regular Morphic Event processing (Bert) ok, aren't Morphic events constrained to flow along the Morphic component tree ?
Yes, and that's exactly what you want. Clicking or typing outside the simulator window should not create a simulation event.
If so, that would imply sub-classing the ImageMorph that the Simulator paints the world on where I can implement the forwarding/filtering/translation functionality.
Not necessarily. There should be a SimulatorMorph that holds onto the simulator itself and embeds the image morph. That could present a nice UI, and could either handle events itself, or register event handlers with the image morph. If you need UI ideas I suggest you try my SqueakJS morph (which essentially is also a VM simulator): http://lively-web.org/users/bert/squeak.html It's a shame we never bothered to make the simulator this nice ;)
- Bert -
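Bert's SimulatorMorph idea might be sketched like this: a plain Morph that embeds the ImageMorph and claims mouse events for itself via the standard Morphic hooks, forwarding them to the simulator. All names except ImageMorph and the standard handlesMouseDown:/mouseDown: hooks are hypothetical, and whether translatedBy: is the right coordinate adjustment should be checked:

```smalltalk
Morph subclass: #SimulatorMorph
	instanceVariableNames: 'simulator displayView'
	classVariableNames: ''
	category: 'VMMaker-Support'.

SimulatorMorph>>handlesMouseDown: evt
	"Claim mouse events so they stop here rather than falling through."
	^true

SimulatorMorph>>mouseDown: evt
	"Translate into the simulated display's coordinate space and forward;
	 queueForwardedEvent: is the hypothetical simulator-side entry point."
	simulator queueForwardedEvent: (evt translatedBy: self topLeft negated)
```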
On 31-01-2014, at 2:24 PM, Bert Freudenberg bert@freudenbergs.de wrote:
Not necessarily. There should be a SimulatorMorph that holds onto the simulator itself and embeds the image morph. That could present a nice UI, and could either handle events itself, or register event handlers with the image morph.
A useful UI for the simulator would allow pushing in events that are *not* those going on in the outer world; some way to send in a variety of keyboard/button events would help with making sure that the event handling can cope with variations not provided by your host. For example, some platforms send a triple of keydown/keystroke/keyup for each keyboard press. Some do it quite differently. Or say one is trying to develop handling for a new event that isn’t actually provided by any VM yet (OS_TeaTime_Alert, for example) and you want to make sure it can work.
At the very least it might lead to some decent documentation of WTF goes on down there.
tim -- tim Rowledge; tim@rowledge.org; http://www.rowledge.org/tim "How many Slavers does it take to change a lightbulb?" "Dunno. How susceptible are lightbulbs to telepathy?"
On Fri, Jan 31, 2014 at 11:24:33PM +0100, Bert Freudenberg wrote:
On 31.01.2014, at 18:05, gettimothy gettimothy@zoho.com wrote:
Use regular Morphic Event processing (Bert) ok, aren't Morphic events constrained to flow along the Morphic component tree ?
Yes, and that's exactly what you want. Clicking or typing outside the simulator window should not create a simulation event.
If so, that would imply sub-classing the ImageMorph that the Simulator paints the world on where I can implement the forwarding/filtering/translation functionality.
Not necessarily. There should be a SimulatorMorph that holds onto the simulator itself and embeds the image morph. That could present a nice UI, and could either handle events itself, or register event handlers with the image morph. If you need UI ideas I suggest you try my SqueakJS morph (which essentially is also a VM simulator): http://lively-web.org/users/bert/squeak.html It's a shame we never bothered to make the simulator this nice ;)
+1
It would be great if someone were to take inspiration from Bert's SqueakJS morph and do something similar for the traditional VM simulators.
Personally, it's inspiring me to want to learn about JS some day soon. That lively VM is seriously impressive.
Dave
On 01.02.2014, at 00:15, David T. Lewis lewis@mail.msen.com wrote:
Thanks :) I must give credit where credit is due: This was a lot easier to do in Lively than if I had done it in vanilla HTML+Javascript. Almost as easy as if I had done it in Squeak ;)
- Bert -
Hi Bert.
Looking at your work at http://lively-web.org/users/bert/squeak.html, in the top middle display, am I looking at a process browser and a "stack" of suspended and active contexts?
[] in BlockContext>>newProcess [] in ProcessorScheduler class>>startUp ProcessorScheduler class>>idleProcess
ctx[5]=rcvr: the ProcessorScheduler class
thx,
tty
On 02.02.2014, at 17:59, gettimothy gettimothy@zoho.com wrote:
Almost. It is the stack of the currently running process. That is, the activeContext and its sender context and its sender context etc.
You can see the same thing, just displayed differently, when you follow the activeContext's sender chain in the left tree display.
The other runnable processes are in the Processor global, the 4th item in the specialObjects array on the left. This image has 8 priority levels. It's more instructive to look at it right after loading, because then you will see the idleProcess at priority 1 in the quiescentProcessList. The idleProcess is always runnable. So if no higher-priority process is runnable (which is most of the time) this is what you see. The other processes are all waiting on some Semaphore so they are not in that list.
- Bert -
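As an aside for readers following along: Bert's description can be poked at from a Workspace in any Squeak image. The snippet below is an illustrative sketch, not part of the original mail; instVarNamed: is just a convenient way to peek at the scheduler's internals.

```smalltalk
"The running process, and the top of the sender chain shown in the tree"
Processor activeProcess.
thisContext.

"The scheduler is reachable via the specialObjectsArray (4th slot, as Bert notes)"
Smalltalk specialObjectsArray at: 4.

"One quiescent (runnable-but-not-running) process list per priority level"
(Processor instVarNamed: 'quiescentProcessLists') withIndexDo:
	[:list :priority |
	 Transcript show: priority printString, ' -> ', list printString; cr]
```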
Excellent.
Thank you.
tty.
See attached screen shot.
Hi All,
Quick progress report.
1. I created a mockup of Bert's SqueakJS. It's stubbed out, but the run and help buttons work. If anybody can tell me where to pull the current context from the vm sim, I can replace the bottom middle panel with that. It currently uses: SimpleHierarchicalListMorph on: [ Array with: (ObjectExplorerWrapper with: (self model vm) name: 'root' model: (self model vm) parent: nil) ] where 'vm' is the running StackInterpreterSimulator.
So something like: SimpleHierarchicalListMorph on: [ Array with: (ObjectExplorerWrapper with: (self model vm getTheActiveContext) name: 'root' model: (self model vm somethingOrOtherToHangTheActiveContextUnder) parent: nil) ]
would be appreciated, if it's easy.
2. Ah, events, events, events...how do I know thee...let me count the ways.
Standard Morphic events are created by HandMorph>>processEvents when invoked by WorldState doOneCycleNowFor:
HandMorph grabs the raw evtBuf array off the Sensor (exactly as I forwarded in round one of this) and then wraps them to make them handy for Morphic. Unfortunately what is handy for Morphic is not handy for the VM Simulator.
2.a. The obvious solution is to reverse the wrapping. I do not see any existing work that does this, and it looks a bit daunting. My hunch is that I will have to do it. At that point, you can ridicule my lack of low-level bit fiddling experience.
2.a.1 Unfortunately, HandMorph's addEventListener/sendListenEvent/handleListenEvent event broadcasting framework sends these wrapped events.
2.b. I modified HandMorph and added a parallel event broadcasting/listener framework with addPrimitiveEventListener/sendListenPrimitiveEvent/handleListenPrimitiveEvent and that works just fine. I am able to subscribe to the primitive events from my Morph's model and forward them to the vm simulator--just like with my previous approach with EventSensor.
2.b.1 I am going to continue with this for a bit, as next up, I think I can improve performance by injecting only those events that fall within the bounds of the ImageMorph that houses the simulated World. I will be on that tomorrow.
2.c. Tim R. requested a means to feed events directly to the simulator. The eventListener framework in HandMorph can be ported to the GUI in 10 minutes, so there is hope there.
3. After 2.b.1, I am going to see about injecting Morphic Events that only occur on the ImageMorph that houses the simulated World. I started with this approach initially, but chickened out when I saw the wrapping task and went with the 2 series instead.
I will try to get a clean copy of my code to you all in a couple of days. I have a lot of missteps in my work due to the various approaches I am trying, so it's a bit messy at the moment.
Anyway, that's the update.
cordially,
tty
Hi Tty,
On Tue, Feb 4, 2014 at 2:42 PM, gettimothy gettimothy@zoho.com wrote:
See attached screen shot.
Looking good!
Hi All,
Quick progress report.
- I created a mockup of Bert's SqueakJS. Its stubbed out, but the run and
help button work. If anybody can tell me where to pull the current context from the vm sim, I can replace the bottom middle panel with that. It currently uses:
SimpleHierarchicalListMorph on: [ Array with: (ObjectExplorerWrapper with: (self model vm) name: 'root' model: (self model vm) parent: nil) ] where 'vm' is the running StackInterpreterSimulator.
So something like:
SimpleHierarchicalListMorph on: [ Array with: (ObjectExplorerWrapper with: (self model vm getTheActiveContext) name: 'root' model: (self model vm somethingOrOtherToHangTheActiveContextUnder) parent: nil) ]
would be appreciated if its easy.
There is no active context in the StackInterpreter except at snapshot load or save. There is an active stack frame almost always, but sometimes it is defined by the localFP,localSP inst vars of StackInterpreter (when executing bytecodes) and sometimes by the framePointer,stackPointer inst vars of InterpreterPrimitives (when executing non-inlined primitives).
In the Cog VM it's even more complicated. Sometimes the VM is in the interpreter (frame is localFP,localSP or framePointer,stackPointer as above), sometimes in machine-code, when the current frame is either the machine's frame pointer,stack pointer pair (e.g. %ebp,%esp) or framePointer,stackPointer (when machine-code calls interpreter primitives).
Knowing which state the VM is in is non-trivial. So I don't know what to do here. In the old sim window you'll see the context menu has options for printing the stack in each of these states.
- Ah, events, events, events...how do I know thee...let me count the ways.
Standard Morphic events are created by HandMorph>>processEvents when invoked by WorldState doOneCycleNowFor:
HandMorph grabs the raw evtBuf array off the Sensor (exactly as I forwarded in round one of this) and then wraps them to make them handy for Morphic. Unfortunately what is handy for Morphic is not handy for the VM Simulator.
2.a. The obvious solution is to reverse the wrapping. I do not see any existing work that does this, and it looks a bit daunting. My hunch is that I will have to do it. At that point, you can ridicule my lack of low-level bit fiddling experience.
:-)
2.a.1 Unfortunately, HandMorph's addEventListener/sendListenEvent/handleListenEvent event broadcasting framework sends these wrapped events.
2.b. I modified HandMorph and added a parallel event broadcasting/listener framework with addPrimitiveEventListener/sendListenPrimitiveEvent/handleListenPrimitiveEvent and that works just fine. I am able to subscribe to the primitive events from my Morph's model and forward them to the vm simulator--just like with my previous approach with EventSensor.
2.b.1 I am going to continue with this for a bit, as next up, I think I can improve performance by injecting only those events that fall within the bounds of the ImageMorph that houses the simulated World. I will be on that tomorrow.
2.c. Tim R. requested a means to feed events directly to the simulator. The eventListener framework in HandMorph can be ported to the GUI in 10 minutes, so there is hope there.
- After 2.b.1, I am going to see about injecting Morphic Events that only
occur on the ImageMorph that houses the simulated World. I started with this approach initially, but chickened out when I saw the wrapping task and went with the 2 series instead.
I will try to get a clean copy of my code to you all in a couple of days. I have a lot of missteps in my work due to the various approaches I am trying, so its a bit messy at the moment.
Take missteps as a sign that you're effectively exploring the solution space. Anyway, this is great! Thanks!
Anyway, that's the update.
cordially,
tty
Hi All,
I have some primitive event filtering in place and was able to open a Workspace and type a message. See the attached screenshot.
I figure my work is at the stage where it would benefit from other eyes and minds.
For testing, I merged the attached changeset into 4.5 release candidate 3 after installing the CogDev stuff here: http://www.squeakvm.org/svn/squeak/branches/Cog/image/Workspace.text
I copied over the changes and it worked fine. One caveat is that I have the bug-fix for WriteStream nextChunkPut: in that changeset; you probably don't want that.
To launch the thing, see the 'documentation' protocol on StackInterpreterSimulatorMorph class for examples on launching (currently 1 example (:)
Below are some notes.
Category and source
1. Added category VMMaker-InterpreterSimulation-Morphic, which contains 3 classes: StackInterpreterSimulatorImageMorph, StackInterpreterSimulatorMorph, StackInterpreterSimulatorMorphicModel. EventSensorConstants is added as a poolDictionary to the model.
2. Additions were made to HandMorph. Most are in category *VM.oscog except for one edit to HandMorph>>processEvents
3. Some modifications to StackInterpreterSimulator.
4. I think that's all.
Event forwarding
1. What I call "primitive events" are much easier to use (for me) than un-wrapping morphic events.
2. I modified HandMorph by adding a primitiveEventListener framework that mimics the existing HandMorph >> addEventListener:/sendListenEvent:to:/handleListenEvent: framework.
See HandMorph >> addPrimitiveEventListener:/sendListenPrimitiveEvent:to:/handleListenPrimitiveEvent.
For morphic events, listener notification happens in HandMorph >> handleEvent. For the primitiveEvents, I had to use HandMorph >> processEvents.
3. If this is unacceptable, then the task of 'decoding' the Morphic Events that HandMorph creates in HandMorph >> processEvents needs to be done. I would prefer not to do it, as it is ugly.
4. Primitive events from HandMorph are passed to StackInterpreterSimulatorMorphicModel after being registered with HandMorph.
1. The StackInterpreterSimulatorMorph >> displayView registers the model with the HandMorph for event notification.
5. StackInterpreterSimulatorMorphicModel >> handleListenPrimitiveEvent: evtBuf is where event translation and filtering happens.
1. This needs a deft hand, preferably from somebody who is familiar with Morphic bounds translations. I briefly looked at it and it's a deep subject, so I ran away screaming.
2. You can un-comment some code to see the raw events on the Transcript. I found this handy.
3. EventSensorConstants has a list of other events that you may want to forward.
6. Events are queued in StackInterpreterSimulatorMorphicModel >> handleListenPrimitiveEvent: and dequeued in StackInterpreterSimulator >> ioGetNextEvent.
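The enqueue/dequeue handoff described in points 5 and 6 can be sketched roughly as below. This is a hedged reconstruction, not the actual changeset: eventQueue is assumed to be an OrderedCollection, and wantsEvent: stands in for whatever bounds filtering the model does.

```smalltalk
"Model side: keep a copy of the raw 8-element evtBuf for the simulator"
StackInterpreterSimulatorMorphicModel >> handleListenPrimitiveEvent: evtBuf
	(self wantsEvent: evtBuf) ifTrue:
		[eventQueue addLast: evtBuf copy]

"Simulator side: pop one queued event into the buffer supplied by the
 simulated image's event primitive, or fail the primitive if none is pending"
StackInterpreterSimulator >> ioGetNextEvent: evtBuf
	eventQueue isEmpty ifTrue: [^self primitiveFail].
	eventQueue removeFirst withIndexDo:
		[:value :i | evtBuf at: i put: value]
```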
Tests
1. I have not written any yet.
2. The morph is copied from PreferenceBrowser, so I guess I need to look at those tests.
StackInterpreterSimulator notes.
1. Rule 1 is to not break the existing openAsMorph functionality Eliot has.
2. StackInterpreterSimulator >> openAsStackInterpreterSimulatorMorph launches it. (See StackInterpreterSimulatorMorph class >> ttyOne for an example script.)
3. StackInterpreterSimulator is the instance variable 'vm' on the morphic model.
4. There is tight coupling of the StackInterpreterSimulator's displayView variable across all three layers (Morph-Model-VM). I had a good reason for this, but I forget what it was.
5. StackInterpreterSimulator >> incrementByteCount is what triggers the display redraw. This needs attention. I changed the count from 10,000 to 1,000 to trigger the forceDisplayUpdate call and got some improvement. Things I am wondering:
1. Should we change these triggers to options on launch?
2. Should we make them variable?
3. We can force a fullDisplayUpdate from the GUI (which I tried).
4. Instead of a byteCount triggering it, what else would make sense?
Morphic improvements needed
1. Event bounds translation in StackInterpreterSimulatorMorphicModel >> handleListenPrimitiveEvent. I looked at this briefly and it's a deep subject, so I took the easy way out and reverted to simple addition on the x/y coordinates. (:
2. Mouse and Keyboard events are forwarded when mouse is within the bounds of the ImageMorph.
3. z-order does not stop events from being forwarded. For example, put a Browser over the simulation and events are still forwarded.
4. Mouse drag moves the entire simulation window--I cannot drag a window within the simulation.
5. Splitters and scroll-bars, splitters and scroll-bars (my first morph attempt, sorry)
6. Buttons and callbacks.
7. Bert's SqueakJS windows along the bottom--what makes sense to put there? How to do it?
8. See class comments in StackInterpreterSimulatorMorph
9. Help button contents
10. Tim R's "WTF goes on in there?" button has not been implemented. It's very doable; events just need to be enqueued for the simulator to suck them up.
11. Eliot's command window menu needs testing. I know the clone vm does not work in this simulation. (it does work in the original openAsMorph simulation)
Summary
1. It's usable enough now that I think submitting it is a good idea.
2. I expect to improve it as I learn more
3. I would like to move on towards my goal of porting the StackInterpreterSimulator to native 64.
cheers.
tty
This is impressive!
Btw, we should take the Morphic discussion to squeak-dev. It's pretty much irrelevant to vm-dev.
On 06.02.2014, at 17:36, gettimothy gettimothy@zoho.com wrote:
Event forwarding
1. What I call "primitive events" are much easier to use (for me) than un-wrapping morphic events.
They're not. All the pain you're going through (particularly in your point 5 below) is because you're trying to somehow preserve low-level events.
2. I modified HandMorph by adding a primitiveEventListener framework that mimics the existing HandMorph >> addEventListener:/sendListenEvent:to:/handleListenEvent: framwork. See HandMorph >> addPrimitiveEventListener:/sendListenPrimitiveEvent:to:/handleListenPrimitiveEvent. For morphic events, listener notification happens in HandMorph >> handleEvent. For the primitiveEvents, I had to use HandMorph >> processEvents.
Brave, yet completely unnecessary ;)
3. If this is un-acceptable, then the task of 'decoding' the Morphic Events that HandMorph creates in HandMorph >> processEvents needs to be done. I prefer to not do it as it is ugly.
It will be much easier if you don't think of this as "decoding" to "get back the original events". Forget that your simulator is running on a Squeak VM. Forget that at some point there were low-level events. This is completely irrelevant and just distracting. Think of the morphic events that your Morph receives as the primary input source. *Generate* your own VM events from these morphic events. Much simpler and much more clear.
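A minimal sketch of what Bert is suggesting, assuming a hypothetical enqueueEvent: on the model and taking the 8-word layout from the "Event Types" section of sq.h; the type constant and the button encoding below are illustrative, not authoritative:

```smalltalk
"Generate a VM mouse event straight from the MouseEvent that Morphic
 delivers to the morph; no host-level events are involved at all."
StackInterpreterSimulatorMorph >> handleMouseEvent: aMouseEvent
	| local |
	"translate from world coordinates into the simulated Display's coordinates"
	local := aMouseEvent position - displayView bounds origin.
	self model enqueueEvent:
		{1.		"EventTypeMouse"
		 aMouseEvent timeStamp.
		 local x.
		 local y.
		 aMouseEvent redButtonPressed ifTrue: [4] ifFalse: [0].	"button bits"
		 0.		"modifier bits"
		 0.		"reserved"
		 0}		"host window index"
```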
4. Primitive events from HandMorph are passed to StackInterpreterSimulatorMorphicModel after being registered with HandMorph. 1. The StackIterpreterSimulatorMorph >> displayView registers the model with the HandMorph for event notification. 5. StackInterpreterSimulatorMorphicModel >> handleListenPrimitiveEvent: evtBuf is where event translation and filtering happens. 1. This needs a deft hand. Preferrably by somebody who is familiar with Morhic bounds translations. I briefly looked at it and its a deep subject, so I ran away screaming. 2. You can un-comment some code to see the raw events on the transcript. I found this handy. 3. EventSensorConstants has a list of other events that you may want to forward.
The problem here is that you are trying to circumvent morphic event handling, rather than using it.
6. Event's are queued in StackInterpreterSimulatorMorphicModel >> handleListenPrimitiveEvent: and dequeued in StackInterpreterSimulator >> ioGetNextEvent.
Maybe this helps: Your host VM may or may not run using the event primitives. It might use the polling primitives. Morphic runs fine on both. So in some cases it's not even *possible* to use the "original VM events".
- Bert -
Hi Bert.
Thanks for the compliment and the pointers.
I agree that the mods to HandMorph need to go away and the existing functionality should be used. That means using the existing HandMorph>>addEventListener and, in the morph's model, generating the event format that StackInterpreterSimulator >> ioGetNextEvent: evtBuf expects.
Unfortunately, I found that step daunting. Basically, it was a time-and-energy decision.
Since I already knew that the StackInterpreterSimulator would respond to EventSensor events, I decided to eliminate a possible source of error by going with a known working format. That allowed me to focus on filtering and transforming (such as they are). When I get time, I will work on what you suggest, as I agree with you.
However, IF there is some Morphic talent out there that can do this stuff in their sleep, then I hope they will take up the task. This is my first Morphic attempt and it is sure to have some ugly stuff in it. I would prefer to have a pro do it right.
>>Maybe this helps: Your host VM may or may not run using the event primitives. It might use the polling primitives. Morphic runs fine on both. So in some cases it's not even *possible* to use the "original VM events".
That sounds interesting! I never heard of this distinction in events. What are they?
Thank you again.
cordially,
tty
On Thu, Feb 6, 2014 at 8:36 AM, gettimothy gettimothy@zoho.com wrote:
Hi All,
I have some primitive event filtering in place and was able to open a Workspace and type a message. See the attached screenshot.
Awesome!! Thank you so much!
I figure my work is at the stage where it would benefit from other eyes and minds.
For testing, I merged the attached changeset into 4.5 release candidate 3 after installing the CogDev stuff here: http://www.squeakvm.org/svn/squeak/branches/Cog/image/Workspace.text
I copied over the changes and it worked fine. One caveat is that I have the bug-fix for WriteStream nextChunkPut: in that change set. you probably don't want that.
To launch the thing, see StackInterpreterSimlatorMorph class > documentation protocol for examples on launching (currently 1 example (:)
Below are some notes.
*Category and source* 1.added category VMMaker-InterpreterSimulation-Morphic which contains 3 classes StackInterpreterSimulatorImageMorph StackInterpreterSimulatorMorph StackInterpreterSimulatorMorphicModel EventSensorConstants is added as a poolDictionary to the model
- Additions where made to HandMorph. Most are in category *VM.oscog
except for one edit to HandMorph>>processEvents
Some modifications to StackInterpreterSimulator.
I think that's all.
Can you e.g. send me your VMMaker package from your local package-cache? And can you send me the hand edit to HandMorph>>processEvents?
Looks like Bert is providing useful feedback. I'm delighted to have this working!!
*Event forwarding*
1. What I call "primitive events" are much easier to use (for me) than
un-wrapping morphic events.
2. I modified HandMorph by adding a primitiveEventListener framework
that mimics the existing HandMorph >> addEventListener:/sendListenEvent:to:/handleListenEvent: framwork.
See HandMorph >>
addPrimitiveEventListener:/sendListenPrimitiveEvent:to:/handleListenPrimitiveEvent.
For morphic events, listener notification happens in HandMorph >>
handleEvent. For the primitiveEvents, I had to use HandMorph >> processEvents.
3. If this is un-acceptable, then the task of 'decoding' the Morphic
Events that HandMorph creates in HandMorph >> processEvents needs to be done. I prefer to not do it as it is ugly.
4. Primitive events from HandMorph are passed to
StackInterpreterSimulatorMorphicModel after being registered with HandMorph. 1. The StackIterpreterSimulatorMorph >> displayView registers the model with the HandMorph for event notification.
5. StackInterpreterSimulatorMorphicModel >>
handleListenPrimitiveEvent: evtBuf is where event translation and filtering happens. 1. This needs a deft hand. Preferrably by somebody who is familiar with Morhic bounds translations. I briefly looked at it and its a deep subject, so I ran away screaming. 2. You can un-comment some code to see the raw events on the transcript. I found this handy. 3. EventSensorConstants has a list of other events that you may want to forward.
6. Event's are queued in StackInterpreterSimulatorMorphicModel >>
handleListenPrimitiveEvent: and dequeued in StackInterpreterSimulator >> ioGetNextEvent.
*Tests* 1. I have not written any yet. 2. The morph is copied from PreferenceBrowser, so I guess I need to look at those tests.
*StackInterpreterSimulator notes.*
1. Rule 1 is to not break the existing openAsMorph functionality Eliot
has.
2. StackInterpreterSimulator >> openAsStackInterpreterSimulatorMorph
launches it. (see StackIntepreterSimulatorMorph class ttyOne for example script)
3. StackInterpreterSimulator is the instance variable 'vm' on the
morphic model.
4. There is tight-coupling of the StackInterpreterSimulator's
displayView variable across all three layers (Morph-Model-VM) I had a good reason for this, but I forget what it was.
5. StackIntepreterSimulator >> incrementByteCount is what triggers the
display redraw. *This needs attention. * I changed the count to 1000 from 10,000 to trigger the forceDisplayUpdate call and got some improvement. Things I am wondering 1. Should we change these triggers to options on launch? 2. Should we make them variable? 3. We can force a fullDisplayUpdate from the GUI, (Which I tried). 4. Instead of a byteCount triggering it, what else would make sense?
I think the right place to get the simulator to update the image morph is the BitBltPlugin simulator or the SurfacePlugin, wherever the code lives that, in the real VM, triggers an update of the real screen.
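One hedged reading of Eliot's suggestion, as a sketch: let the simulator's display-update hook invalidate the hosting morph, instead of counting bytecodes. The override below is illustrative; the actual hook selector in the simulator may differ.

```smalltalk
"When the simulated image forces a display update (e.g. after a BitBlt has
 touched the screen), mark the morph dirty so Morphic repaints it next cycle."
StackInterpreterSimulator >> ioForceDisplayUpdate
	displayView ifNotNil: [displayView changed]
```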
*Morphic improvments needed*
1. Event bound translation in StackInterpreterSimulatorMorphicModel>>
handleListenPrimitiveEvent I looked at this briefly and its a deep subject, so I took the easy way out and reverted to addition on x/y coordinates. (:
2. Mouse and Keyboard events are forwarded when mouse is within the
bounds of the ImageMorph.
3. z-order does not stop events from being forwarded. For example,but
a Browser over the simulation and events are still forwarded.
4. Mouse drag moves the entire simulation window--I cannot drag a
window within the simulation.
5. Splitters and scroll-bars, splitters and scroll-bars (my first
morph attempt, sorry)
6. Buttons and callbacks. 7. Bert's SqueakJS windows along the bottom--what makes sense to put
there? How to do it?
8. See class commments in StackInterpreterSimulatorMorph 9. Help button contents 10. Tim R's "WTF goes on in there?" button has not been implemented.
Its very doable; events just need to be enqueued for the simulator to suck them up.
11. Eliot's command window menu needs testing. I know the clone vm
does not work in this simulation. (it does work in the original openAsMorph simulation)
*Summary* 1. It's usable enough now that I think submitting it is a good idea
2. I expect to improve it as I learn more 3. I would like to move on towards my goal of porting the
StackInterpreterSimulator to native 64.
cheers.
tty
Nice proof of concept!
On 30 Jan 2014, at 7:50 , gettimothy gettimothy@zoho.com wrote:
Hi Eliot, Bob and Henry.
Changeset and pic attached. (If the changeset has problems, let me know; I copied from my live changeset to a clean copy that omits my fiddling-around changes, so I might have missed something.)
I would appreciate your advice on how to proceed from here.
The approach is modeled after what I saw in HostWindowProxy. I have other things on my checklist to try, but this seemed the easiest approach, so I ran with it. My first goal was to just get the darned events into the simulator, see something on the simulated transcript and a menu to show up.
Observations:
The events get there and the WorldMenu pops up. woot. (lower case "woot", no exclamation point; not to be confused with WOOT!!!!) The tildes "~" in the simulation Transcript (in the attached pic) are generated at StackInterpreterSimulator >> ioGetNextEvent. Event processing is dog slow. Be prepared to wait 5 minutes for the menu to pop up.
Was event coordinate translation somehow not needed, or did a click in top left of your screenshot end up opening the menu on mid right side?
Tightly coupled. I don't like the modification to EventSensor. I don't like having to implement a Singleton--but it was saner than (Smalltalk at: #StackInterpreterSimulatorLSB) allInstances do: […]
IMHO, if you just keep the EventSensor override code dealing with event delivery to the morph in a separate package, it would be just fine to put off exchanging it for a less intrusive alternative until later. As long as the responsibilities are clear (push vs. pull, which of the deliverer/simulator morph does any needed coordinate translation, etc.), it shouldn't be too hard.
I am unfamiliar with Smalltalk Semaphores; had I known how to do the Smalltalk equivalent of an event listener, I would have gone that route. The consumer/producer Semaphore examples from the various books I have read are all within the same process, so I cut my losses and went with the current approach.
There's already such a Semaphore operating across the VM/image divide, set by primitive 93. You can see an example of how to do an event listener using that in a Pharo image's InputEventFetcher ;) (whose consumer action is… pushing the event to an in-image queue :P)
Related, the morphic polling loop in Pharo images has been decoupled from the raw act of calling getNextEvent:, so for those, correct signaling of the semaphore set by primitive 93 is required to get input processing simulated correctly. It shouldn't be harder to make it work than signaling said semaphore from ioProcessEvents if the eventQueue is not empty, provided the semaphore setter primitive is actually operational in the Simulator :P
Cheers, Henry
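Henry's suggestion might look roughly like this in the simulator. This is a sketch; inputEventSemaphoreIndex is a hypothetical name for wherever the simulator records the index registered via primitive 93.

```smalltalk
"After pumping host events into the queue, signal the image-side input
 semaphore so a non-polling image wakes up and calls ioGetNextEvent:."
StackInterpreterSimulator >> ioProcessEvents
	(eventQueue notEmpty and: [inputEventSemaphoreIndex > 0]) ifTrue:
		[self signalSemaphoreWithIndex: inputEventSemaphoreIndex]
```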
EventSensor>>processEvent: evt does have the events in raw form, so there is no need to translate before forwarding to the StackInterpreter. In principle, we know we CAN get the dang events into the StackInterpreter. How do I make this responsive and useful?
I'd say what you've achieved is already quite useful! Maybe filtering out MouseOver events would cut down on the response time? I would imagine the Simulator could be working up a hefty backlog of bytecodes to simulate in order to process those...
Cheers, Henry
Henry, Bob and Eliot.
Thanks all.
After I study what Eliot wrote and devise my plan, I will post it back here. Bear in mind that I am new to this. I am experiencing the same difficulty in thinking 'VM' vs 'Smalltalk' as I did when thinking 'Smalltalk' vs 'Algol'; I am sure the light bulb will come on (40 Watt, sadly) eventually, but expect missteps.
BTW, is there an existing working process in the Simulator that I can use as a model? I was planning on mapping what Eliot wrote to what I can see in the Simulator's command window menu as a starting point to get a feel for things. Good place to start?
cordially,
tty
---- On Thu, 23 Jan 2014 03:32:32 -0800 Bob Arning <arning315@comcast.net> wrote ----
I don't think there is any reason to convert to an OS array - HandMorph>>processEvents already knows the OS event array that was used to generate the squeak event. Making that accessible may be all you need from the front end.
Cheers, Bob
On 1/23/14 4:43 AM, Henrik Johansen wrote:
On 22 Jan 2014, at 7:05 , Eliot Miranda <eliot.miranda@gmail.com> wrote:
So your challenge is to take the Morphic event, queue it inside StackInterpreterSimulator (e.g. in a new inst var eventQueue), and convert it to an OS event (an Array), so that it can for example implement ioGetNextEvent: as something like
StackInterpreterSimulator>>ioGetNextEvent: evtBuf
eventQueue isEmpty ifTrue:
[^self primitiveFail].
self convertMorphicEvent: eventQueue removeFirst into: evtBuf
As for the conversion into the OS array, searching for "Event Types" in sq.h will probably give a better introduction to, and faster understanding of, the structure of said array than delving into the code that does the array -> morphic event conversion in an image. Make sure you get WindowEvent type 6 correct! :)
Cheers, Henry
Hi Tty,
On Thu, Jan 23, 2014 at 5:11 AM, gettimothy gettimothy@zoho.com wrote:
Henry, Bob and Eliot.
Thanks all.
After I study what Eliot wrote and devise my plan, I will post it back here. Bear in mind that I am new to this. I am experiencing the same difficulty in thinking 'VM' vs 'Smalltalk' as I did when thinking 'Smalltalk' vs 'Algol'; I am sure the light bulb will come on (40 Watt, sadly) eventually, but expect missteps.
BTW, is there an existing working process that the Simulator currently does that I can use as a model? I was planning on mapping what Eliot wrote to what I can see in the Simulator's command window menu as a starting point to get a feel for things. Good place to start ?
Sure. But you're going to add event handler code to where that window is created, which is StackInterpreterSimulator>>openAsMorph. You're going to need a new inst var in StackInterpreterSimulator to hold the event. As Bob says (wish I'd said it too, but I couldn't find it), the Morphic events may hold onto their OS events anyway, so you may not even have to convert.
Anyway, dive in, get your hands dirty, and you don't have to commit your mistakes, so we'll never know, only see the shiny finished product ;-)
cordially,
tty
---- On Thu, 23 Jan 2014 03:32:32 -0800 *Bob Arning <arning315@comcast.net arning315@comcast.net>* wrote ----
I don't think there is any reason to convert to an OS array - HandMorph>>processEvents already knows the OS event array that was used to generate the squeak event. Making that accessible may be all you need from the front end.
Cheers, Bob
On 1/23/14 4:43 AM, Henrik Johansen wrote:
31-9AAB-46F2-BE03-164F3C2C6759@veloxit.no" type="cite">
On 22 Jan 2014, at 7:05 , Eliot Miranda eliot.miranda@gmail.com wrote:
So your challenge is to take the Morphic event, queue it inside StackInterpreterSimulator (e.g. in a new inst var eventQueue), and convert it to an OS event (an Array), so that it can for example implement ioGetNextEvent: as something like
StackInterpreterSimulator>>ioGetNextEvent: evtBuf
	eventQueue isEmpty ifTrue: [^self primitiveFail].
	self convertMorphicEvent: eventQueue removeFirst into: evtBuf
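[A hypothetical sketch of the conversion step. The selector convertMorphicEvent:into: is Eliot's; the body is guesswork against the "Event Types" comment in sq.h, handling only the mouse case:]

```smalltalk
StackInterpreterSimulator >> convertMorphicEvent: aMorphicEvent into: evtBuf
	"Hypothetical: translate a host Morphic mouse event into the flat
	 OS-style event array the simulated image expects."
	aMorphicEvent isMouse ifTrue:
		[evtBuf
			at: 1 put: 1;	"assumed mouse event type code"
			at: 2 put: aMorphicEvent timeStamp;
			at: 3 put: aMorphicEvent position x;
			at: 4 put: aMorphicEvent position y;
			at: 5 put: aMorphicEvent buttons;
			at: 6 put: 0;	"modifiers -- extraction left as an exercise"
			at: 7 put: 0;	"reserved"
			at: 8 put: 0]	"host window index"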
I think I see the approach now.
Because the Simulated VM is not actually running as its own operating system process, there are no I/O events coming from the operating system to it via the normal VM paths. However, we can cheat by taking OS events from the host process and "injecting" them into the simulation process in a suitable form such that the simulated UI running on top of the simulated VM cannot tell the difference.
Does that about sum it up?
thx.
tty
On Thu, Jan 23, 2014 at 1:50 PM, gettimothy <gettimothy@zoho.com> wrote:
Perfectly. Why couldn't I say it that way?
"Perfectly. Why couldn't I say it that way?"
Because you are smarter than I am and know what you are doing! (:
Thanks for your help.
tty.
On Thu, Jan 23, 2014 at 2:27 PM, gettimothy <gettimothy@zoho.com> wrote:
Not so! I've overcomplicated for a long time. Give yourself a pat on the back for seeing clearly and simply. This is a complex beast you're diving into.
Eliot,
Without your complex description, I could not have written that simple summary. Your know-how of where to look enabled me to connect the dots.
Don't fear, though, I am sure I'll gum something up!
cheers.
tty
vm-dev@lists.squeakfoundation.org