I swear I've asked this before but a quick google didn't find anything, so forgive any duplication.
Do we have a swiki package suited to recent (preferably current) Squeak? The only trace I can find is a very, very old package on SqueakMap, intended for 3.6 with KomService & KomHttpServer. Surely there is something newer than 2005 out there?
tim -- tim Rowledge; tim@rowledge.org; http://www.rowledge.org/tim Littergators resolve disputes about rubbish
The short answer is that Chris modified the old swiki code and it now runs in images newer than 3.6. The question is: what do you wish for? If you want to run your own swiki, I could make a 5.2 64-bit version of squeakros.org, which is my living playground, where I sync daily with the real swiki and all my stuff. It is VERY customizable with external HTML5, JS, and CSS, and now with WASM.
The user for squeakros.org is 'visita', no password.
The box which serves squeakros.org is an old Pentium IV with a somewhat slimmed-down Ubuntu MATE 16.
Edgar @morplenauta
Hi Edgar,
On 2019-07-18, at 3:01 AM, Edgar J. De Cleene edgardec2005@gmail.com wrote:
The short answer is that Chris modified the old swiki code and it now runs in images newer than 3.6. The question is: what do you wish for?
I'd like to have a local swiki; nothing more complex than that.
Ideally we would have one that could run off the WebServer already in-image, but one based on Seaside would be fine too.
If you want to run your own swiki, I could make a 5.2 64-bit version of squeakros.org, which is my living playground, where I sync daily with the real swiki and all my stuff. It is VERY customizable with external HTML5, JS, and CSS, and now with WASM.
The user for squeakros.org is 'visita', no password.
Sounds interesting
tim -- tim Rowledge; tim@rowledge.org; http://www.rowledge.org/tim "My name is Inigo Montoya. You killed my parent process. Prepare to vi!"
I removed some of the current .image running http://www.squeakros.org so it only has what is related to Squeak: no S8, none of my other stuff, and no WASM.
Some notes here: http://190.193.252.204:9090/renderMe:Friendly%20Swiki
I limit the sync to 1 January 2018. Some pictures: http://squeakros.org/Screen%20Shot%202019-07-19%20at%2009.41.51.jpg http://squeakros.org/Screen%20Shot%202019-07-19%20at%2009.43.22.jpg
The "app" Mac directory was zipped and transferred to the Linux box. After some tweaks it runs well.
Edgar @morplenauta
So far I've found a lot of big projects that need huge-looking piles of infrastructure and are mostly Pharo. I'd like something a bit lighter.
I found Ramon Leon's old tutorial based (I think) on Seaside 2 and managed to update the code to make a very (very) basic swiki page. All I really need on top of this is a parser for some variety of simple markup-to-html; I don't care much whether it is 'traditional' swiki or MarkDown etc.
Does anyone have a pointer to something suitable? Surely the code for the main swiki is actually out there somewhere?
tim -- tim Rowledge; tim@rowledge.org; http://www.rowledge.org/tim Strange OpCodes: PMT: Punch Magnetic Tape
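For a sense of scale, the core of the markup-to-HTML step being asked for here really is tiny. Below is an illustrative sketch in Python (not Squeak, and not the actual swiki grammar); it handles only *bold*, _italic_, and [link], the same three constructs that show up in the examples later in this thread:

```python
import re
from html import escape

def wiki_to_html(text):
    # Escape raw HTML first, then rewrite the three toy constructs.
    out = escape(text)
    out = re.sub(r'\*([^*]+)\*', r'<strong>\1</strong>', out)
    out = re.sub(r'_([^_]+)_', r'<em>\1</em>', out)
    out = re.sub(r'\[([^\]]+)\]', r'<a href="\1.html">\1</a>', out)
    return '<p>%s</p>' % out

print(wiki_to_html('Single paragraph with *bold* and _italic_ text and a [link]'))
```

A real grammar also needs escaping rules, nesting, and block elements, which is what the PEG-based approach discussed in the rest of this thread provides.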
On 2019-08-04, at 5:30 PM, tim Rowledge tim@rowledge.org wrote:
So far I've found a lot of big projects that need huge looking piles of infrastructure and mostly Pharo. I'd like something a bit lighter.
[snip] Oh, foo; I forgot to mention having found an old pointer in ss3 (from 5 years ago!):

(Installer ss3)
    project: 'SqueakServices';
    install: 'Swiki'.

... but it isn't there any more and I couldn't find anything that looked like it.
tim -- tim Rowledge; tim@rowledge.org; http://www.rowledge.org/tim Useful random insult:- If he were any more stupid, he'd have to be watered twice a week.
SmaCC doesn’t load into Squeak anymore. As Levente’s post a few back said, the parser to have is in Xtreams. It has a PEG parser with a wiki grammar. You need to write an Actor subclass for it. [1] The VW package comes with Actor subclass examples (i.e. PEG.WikiGenerator).
Chris
[1] https://code.google.com/archive/p/xtreams/wikis/Parsing.wiki
Actually, the SmaCC runtime (SmaCC package) still loads. It's just that the dev tools package (SmaCCDev) needs some changes to be up-to-date. So, if you have an existing SmaCC-based parser, it probably still works.
The Xtreams-Parsing package has PEGWikiGenerator, which converts some wiki syntax to xhtml. It uses monty's XML parser, which is a highly extended version of XML-Parser, but it's not backwards compatible with Squeak's version. It loads cleanly into the image, and its 5000+ tests all pass. If you don't use the XML-Parser package, loading Xtreams with Metacello will cause no problems for you.
Another option, as you wrote, is to create a Squeak-compatible PEGActor to generate the html. It should be fairly easy using the PEGWikiGenerator class.
Levente
Well, after trying to look into the sources delivered from the code.google page with no luck, I found the SqueakSource repository version. That seems to have loaded OK, but it references a class XMLString that I have had no success in finding, and which is of course Undeclared. Does anyone know where it can be found?
tim -- tim Rowledge; tim@rowledge.org; http://www.rowledge.org/tim Do files get embarrassed when they get unzipped?
On Mon, 5 Aug 2019, tim Rowledge wrote:
Well after trying to look into the sources delivered from the code.google page with no luck I found the SqueakSource repository version. That seems to have loaded ok but references a class XMLString that I have had no success in finding and is of course Undeclared. Anyone know where it can be found?
That's why Xtreams should be loaded via Metacello, so that all dependencies are loaded along with it. Metacello will load monty's version of XML-Parser, which contains the XMLString class (which is still not backwards compatible, so if you need XML for anything else, then you'll have to write a new actor to generate html, but the parser will still be fine as-is). Here's how to do that:
1. Install Metacello from the Docking Bar's Tools menu's Metacello entry
2. Evaluate: (Smalltalk at: #ConfigurationOfXtreams) project bleedingEdge load
Levente
P.S.: The actual problem is that people had started to make projects Pharo compatible on squeaksource, often by breaking Squeak compatibility (e.g. SmaCC, and partially Xtreams), and later moved on to smalltalkhub/github, leaving the broken (from both Squeak's and Pharo's POV) code to rot on squeaksource.
On Aug 5, 2019, at 10:37 PM, Levente Uzonyi leves@caesar.elte.hu wrote:
On Mon, 5 Aug 2019, tim Rowledge wrote:
Well after trying to look into the sources delivered from the code.google page with no luck I found the SqueakSource repository version. That seems to have loaded ok but references a class XMLString that I have had no success in finding and is of course Undeclared. Anyone know where it can be found?
That's why Xtreams should be loaded via Metacello, so that all dependencies are loaded along with it. Metacello will load monty's version of XML-Parser, which contains the XMLString class (which is still not backwards compatible, so if you need XML for anything else, then you'll have to write a new actor to generate html, but the parser will still be fine as-is). Here's how to do that:
- Install Metacello from the Docking Bar's Tools menu's Metacello entry
At the risk of being a petty corrector kind of guy, and because I just tried this out: it’s the Do menu, and Installer ensureRecentMetacello. I think I recall a somewhat long debate about adding this. It’s clearly quite valuable.
Evaluate this first:
Installer ss
    project: 'MetacelloRepository';
    install: 'ConfigurationOfXtreams'.
And then this:
- Evaluate (Smalltalk at: #ConfigurationOfXtreams) project bleedingEdge load
Or it will complain it has no knowledge of a ConfigurationOfXtreams.
I’m pleased to see that PEGWikiGenerator. Sean did a great thing there, because it’s a nice concrete example without having to fish around in VW. There are two other actor subclasses, but they are sort of recursive and trippy. Parsers making parsers making parsers and so on.
Chris
Anthrax - Got The Time https://www.youtube.com/watch?v=be7iNHw8QoQ
OK, so allowing metacello to do its thing loaded up a boatload of stuff including FFI (wtf?) that really isn't going to be of any use for this problem, but whatever.
(And, as an aside, does anyone else get a bit irritated by the plethora of ways stuff gets loaded? So far in my recent quest I have had to use metacello #configuration:/#load, #baseline:/#repository/#load, install #project:/#addPackage:/#install, installer #merge:/MaInstaller #merge:, SqueakMap and plain MC loading. Wheee!)
But still no XMLString. The metacello configurations all seem to use 'Xtreams-Parsing-Martin Kobetic.2' in the only reference to any package with a name including 'pars'. The problem with googling for 'XMLString' is the commonality of the dratted word. Once one finds the right key it always seems so obvious... hindsight being so very clear.
I finally found a version of 'monty's xml-parser' on squeakmap which a) took ages to install b) overwrites an XML-Parser category we already have in Squeak 5.2, so who knows what gets mixed up c) does actually, finally, include a class named XMLString. I wonder if it will be friendly? d) adds *11* new entries to Undeclared! Good grief.
What a mess we've let ourselves get into.
tim -- tim Rowledge; tim@rowledge.org; http://www.rowledge.org/tim Any nitwit can understand computers. Many do.
Copy this code into a Workspace and DoIt.
Installer ss
    project: 'MetacelloRepository';
    install: 'ConfigurationOfXMLParser'.
(Smalltalk at: #ConfigurationOfXMLParser) project bleedingEdge load
wikiGrammar := PEGParser grammarWiki reading.
wikiParser := PEGParser parserPEG
    parse: 'Grammar'
    stream: wikiGrammar
    actor: PEGParserParser new.
input := 'Single paragraph with *bold* and _italic_ text and a [link]' reading.
wikiParser parse: 'Page' stream: input actor: PEGWikiGenerator new
And you’ll get this.
<div><p>Single paragraph with <span style="font-weight: bold">bold</span> and <span style="font-style: italic">italic</span> text and a <a href="link.html">an OrderedCollection($l $i $n $k)</a></p></div>
And, yes, that OrderedCollection is an error, meaning it didn’t work entirely. To get insight into that, execute this. You will see the process in medias res. The actor has been changed to nil. What this provides is what you need the Actor subclass for.
wikiGrammar := PEGParser grammarWiki reading.
wikiParser := PEGParser parserPEG
    parse: 'Grammar'
    stream: wikiGrammar
    actor: PEGParserParser new.
input := 'Single paragraph with *bold* and _italic_ text and a [link]' reading.
wikiParser parse: 'Page' stream: input actor: nil
Chris
Judas Priest - Living After Midnight https://www.youtube.com/watch?v=_gopyByrAtY
On 2019-08-05, at 10:50 PM, Chris Cunnington brasspen@gmail.com wrote:
Copy this code into a Workspace and DoIt.
Installer ss project: 'MetacelloRepository'; install: 'ConfigurationOfXMLParser'. (Smalltalk at: #ConfigurationOfXMLParser) project bleedingEdge load
OK, so yet another "I couldn't spot this with google" moment. What fun.
It does at least load. And then of course I had to load the Xtreams stuff, because I foolishly thought trying a cleaner image might be smart after all the 'fun' of previous attempts.
wikiGrammar := PEGParser grammarWiki reading. wikiParser := PEGParser parserPEG parse: 'Grammar' stream: wikiGrammar actor: PEGParserParser new. input := 'Single paragraph with *bold* and _italic_ text and a [link]' reading. wikiParser parse: 'Page' stream: input actor: PEGWikiGenerator new
And you’ll get this.
<div><p>Single paragraph with <span style="font-weight: bold">bold</span> and <span style="font-style: italic">italic</span> text and a <a href="link.html">an OrderedCollection($l $i $n $k)</a></p></div>
Just for fun I did try that in the earlier, dirty image and it failed because somewhere it decided that 'link' was an OrderedCollection of symbols instead of a string. Made for an amusing crash within XMLWriter>>#write:escapedWith:.
In a start-from-clean image it did actually do what you show. Thank you for that. I really think a simpler solution would be nice here; adding 4MB to an image for a simple swiki markup parser seems a bit much.
tim -- tim Rowledge; tim@rowledge.org; http://www.rowledge.org/tim Strange OpCodes: RDL: Rotate Disk Left
On Tue, 6 Aug 2019, tim Rowledge wrote:
On 2019-08-05, at 10:50 PM, Chris Cunnington brasspen@gmail.com wrote:
Copy this code into a Workspace and DoIt.
Installer ss project: 'MetacelloRepository'; install: 'ConfigurationOfXMLParser'. (Smalltalk at: #ConfigurationOfXMLParser) project bleedingEdge load
OK, so yet another "I couldn't spot this with google" moment. What fun.
It does at least load. And then of course I had to load the XTreams stuff because I foolishly thought trying a cleaner image might be smart after all the 'fun' of previous attempts.
Right. I also failed to notice it was for XMLParser instead of Xtreams. The following should load Xtreams and its dependencies, including monty's XMLParser into a fresh image:
Installer ensureRecentMetacello.
Installer ss
    project: 'MetacelloRepository';
    install: 'ConfigurationOfXtreams'.
(Smalltalk at: #ConfigurationOfXtreams) project bleedingEdge load
wikiGrammar := PEGParser grammarWiki reading. wikiParser := PEGParser parserPEG parse: 'Grammar' stream: wikiGrammar actor: PEGParserParser new. input := 'Single paragraph with *bold* and _italic_ text and a [link]' reading. wikiParser parse: 'Page' stream: input actor: PEGWikiGenerator new
And you’ll get this.
<div><p>Single paragraph with <span style="font-weight: bold">bold</span> and <span style="font-style: italic">italic</span> text and a <a href="link.html">an OrderedCollection($l $i $n $k)</a></p></div>
Just for fun I did try that in the earlier-dirty image and it failed because somewher it decided that 'link' was an orderedcollection of symbols instead of a string. Made for an amusing crash within XMLWriter>>#write:escapedWith:
In a start-from-clean image it did actually do what your show. Thank you for that. I really think a simpler solution would be nice here. Adding 4Mb to an image for a simple swiki markup parser seems a bit much.
I wrote and attached a different actor, which builds a simple DOM tree that can be turned into a string:
GoogleWikiCompiler example asString
should give
'<div><p>Single paragraph with <strong>bold</strong> and <em>italic</em> text and a <a href="link.html">link</a></p></div>'
It has no external dependencies but Xtreams-Parsing, so it doesn't need monty's XMLParser to be loaded.
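The "simple DOM tree turned into a string" idea can be sketched generically; this is an illustration of the approach in Python, not the actual GoogleWikiCompiler code:

```python
class Node:
    # Minimal DOM-ish node: a tag plus a list of children,
    # where children are either Nodes or plain strings.
    def __init__(self, tag, *children):
        self.tag = tag
        self.children = list(children)

    def __str__(self):
        inner = ''.join(str(c) for c in self.children)
        return '<%s>%s</%s>' % (self.tag, inner, self.tag)

# The actor builds such a tree while parsing; stringifying the root
# yields the finished markup.
tree = Node('div', Node('p', 'text with ', Node('strong', 'bold')))
print(tree)  # <div><p>text with <strong>bold</strong></p></div>
```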
Levente
On Aug 10, 2019, at 6:04 PM, Levente Uzonyi leves@caesar.elte.hu wrote:
Right. I also failed to notice it was for XMLParser instead of Xtreams. The following should load Xtreams and its dependencies, including monty's XMLParser into a fresh image:
Installer ensureRecentMetacello. Installer ss project: 'MetacelloRepository'; install: 'ConfigurationOfXtreams'. (Smalltalk at: #ConfigurationOfXtreams) project bleedingEdge load
+1
I wrote and attached a different actor, which builds a simple dom tree, which can be turned into a string:
GoogleWikiCompiler example asString
should give
'<div><p>Single paragraph with <strong>bold</strong> and <em>italic</em> text and a <a href="link.html">link</a></p></div>'
It has no external dependencies but Xtreams-Parsing, so it doesn't need monty's XMLParser to be loaded.
+1
Chris
On 2019-08-10, at 3:04 PM, Levente Uzonyi leves@caesar.elte.hu wrote:
I wrote and attached a different actor, which builds a simple dom tree, which can be turned into a string:
GoogleWikiCompiler example asString
should give
'<div><p>Single paragraph with <strong>bold</strong> and <em>italic</em> text and a <a href="link.html">link</a></p></div>'
It has no external dependencies but Xtreams-Parsing, so it doesn't need monty's XMLParser to be loaded.
Excellent. That would simplify life a bit.
I spent some fun time trying to work out how on earth the parserparserparserparser (did I go deep enough yet?) works; it's very clever. The problem is that it is edging into the cleverness danger zone - remember that it takes roughly twice as much 'clever' to debug something as it does to write it, and so writing something with all your clever means that you won't be able to debug it.
Eventually I got the hang of enough of it to spot where the link rule was having trouble. Basically, the Sequence rule ends up making [link] into '[' and #($l $i $n $k) and ']'. By what I suspect is mostly good luck, the XML printing that gets used to convert it into the content of the HTML anchor tag wants a collection of characters to iterate over rather than just a String to print, so that bit works. The #newText: method wants an actual String; as a hack fix I made that convert the address parameter to an actual String. Now that's all very well and might even be the right solution, but it could also be a bug in the source grammar for all I know. It's not like the package is exactly overloaded with helpful comments.
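The shape of that hack fix can be sketched in Python rather than Squeak (the names here are hypothetical, standing in for #newText: and its address parameter): a sequence rule can deliver the matched span as a collection of characters, so the actor coerces it before building the node:

```python
def new_text(address):
    # The parser may hand us ['l', 'i', 'n', 'k'] instead of 'link';
    # coerce to a plain string before generating the anchor tag.
    if not isinstance(address, str):
        address = ''.join(address)
    return '<a href="%s.html">%s</a>' % (address, address)

# Both spellings of the input now produce the same anchor.
print(new_text(['l', 'i', 'n', 'k']))
print(new_text('link'))
```

Whether the coercion belongs in the actor or in the grammar is the open question raised above.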
I'll take a look at your googlewiki later; got scaffolding to dismantle and reconfigure for some high altitude painting.
tim -- tim Rowledge; tim@rowledge.org; http://www.rowledge.org/tim Strange OpCodes: RC: Rewind Core
Well, here's a weird thing that I don't recall ever coming up against before.
I've got a basic swiki page component for Seaside, as previously mentioned. Now we have Levente's (original plus small fixes) parser/generator to convert swiki markup to html. I join them together to see what happens ... and weird stuff happens. Testing in a workspace is fine BUT not when getting text back from a browser.
It turns out that the browser (Safari and Chrome) sends us a string with CRLF line ends. Now, I'm a long way from keeping up to date with web stuff, but really? I thought we got LFs because of unixy things.
The practical issue is that the grammar provided in PEGParser class>>#grammarWiki has a load of places reliant on \n and so, for example, the Preformatted & Code tags simply get ignored.
Two obvious questions come to mind here a) what on earth? CRLF? Is that normal or is it an artefact of some Seaside setup I can change? b) if we need to change the grammar to cope with crlf, what is the best way? I don't find the grammar terribly intuitable and can't spot any rule explanation. I've tried changing the Preformatted rule for example to Preformatted <- "---\r\n" .{"---\r\n"} and the parser doesn't even recognise the swiki tags.
I'd hate to have to do a crlf -> lf conversion every time, it seems so inelegant.
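For what it's worth, the inelegant conversion is at least cheap; a sketch in Python (the Squeak equivalent would be a similar one-pass string replacement):

```python
def normalize_line_endings(text):
    # Fold CRLF, then any bare CR, down to LF so a grammar written
    # against \n still matches text posted from a browser form.
    return text.replace('\r\n', '\n').replace('\r', '\n')

print(normalize_line_endings('---\r\ncode block\r\n---\r\n'))
```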
Oh - http://code.google.com/p/support/wiki/WikiSyntax (as referenced in #grammarWiki) seems to be a dead page now, which makes it a not very good bit of documentation! Bizarrely there doesn't appear to be much related info found by google.
tim -- tim Rowledge; tim@rowledge.org; http://www.rowledge.org/tim Strange OpCodes: CM: Circulate Memory
On Tue, 13 Aug 2019 at 22:46, tim Rowledge tim@rowledge.org wrote:
It turns out that the browser (safari and chrome) sends a string with CRLF line ends to us. Now, I'm a long way from keeping up to date with web stuff but really? I thought we got LFs because unixy-things. [...]
Two obvious questions come to mind here a) what on earth? CRLF? Is that normal or is it an artefact of some Seaside setup I can change?
CRLF is common among Internet text protocols, such as HTTP and SMTP. (https://tools.ietf.org/html/rfc2616#section-2.2 https://tools.ietf.org/html/rfc5321#section-2.3.8)
For textarea form elements in HTML, see https://stackoverflow.com/questions/14217101/what-character-represents-a-new... Seems like CRLF is up to standards there as well, at least for the representation on the wire.
In general, while HTTP does not mandate the end of line marker for the body of the message (notwithstanding HTML form specs), it says in RFC2616: "When in canonical form, media subtypes of the "text" type use CRLF as the text line break. HTTP relaxes this requirement and allows the transport of text media with plain CR or LF alone representing a line break when it is done consistently for an entire entity-body. HTTP applications MUST accept CRLF, bare CR, and bare LF as being representative of a line break in text media received via HTTP." (https://tools.ietf.org/html/rfc2616#section-3.7.1)
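The RFC's "MUST accept CRLF, bare CR, and bare LF" requirement translates directly into a tolerant split; an illustrative sketch in Python:

```python
import re

# One pattern accepts all three line-break conventions; alternation
# order matters, so CRLF is tried before bare CR.
LINE_BREAK = re.compile(r'\r\n|\r|\n')

print(LINE_BREAK.split('one\r\ntwo\rthree\nfour'))
```

A grammar rule like Preformatted would need the same alternation in place of a literal \n to be robust against form input.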
Hi Tim,
On Tue, 13 Aug 2019, tim Rowledge wrote:
Well, here's a weird thing that I don't recall ever coming up against before.
I've got a basic swiki page component for Seaside, as previously mentioned. Now we have Levente's (original plus small fixes) parser/generator to convert swiki markup to html. I join them together to see what happens ... and weird stuff happens. Testing in a workspace is fine BUT not when getting text back from a browser.
If you use Seaside, its #renderOn: implementations may be clashing with StreamingHtmlCanvas's. I just checked three such methods in my image (Seaside2 + StreamingHtmlCanvas loaded at the same time), and the implementations are the same, so no issue there.
But, if you use Seaside, you might want to consider using TinyWiki.
It turns out that the browser (safari and chrome) sends a string with CRLF line ends to us. Now, I'm a long way from keeping up to date with web stuff but really? I thought we got LFs because unixy-things.
The practical issue is that the grammar provided in PEGParser class>>#grammarWiki has a load of places reliant on \n and so, for example, the Preformatted & Code tags simply get ignored.
Two obvious questions come to mind here: a) What on earth, CRLF? Is that normal, or is it an artefact of some Seaside setup I can change? b) If we need to change the grammar to cope with CRLF, what is the best way? I don't find the grammar terribly intuitive and can't spot any explanation of the rules. I've tried changing the Preformatted rule, for example, to Preformatted <- "---\r\n" .{"---\r\n"} and then the parser doesn't even recognise the swiki tags.
I'd hate to have to do a crlf -> lf conversion every time, it seems so inelegant.
Oh - http://code.google.com/p/support/wiki/WikiSyntax (as referenced in #grammarWiki) seems to be a dead page now, which makes it a not very good bit of documentation! Bizarrely there doesn't appear to be much related info found by google.
crlf is pretty much an internet thing. Even Squeak's converter method is called #withInternetLineEndings.
I have attached updated GoogleWikiCompiler and #grammarWiki implementations. The new grammar should accept crlf, cr and lf line endings. Weird thing is that \n refers to cr, while \r refers to lf in PEGParser's grammar, so crlf is \n\r...
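For reference, given that \n means CR and \r means LF in PEGParser's notation, a tolerant line-break rule along these lines is one way to express it (a sketch only; the rule name EndOfLine is made up, and the attached grammar may well factor it differently):

```
EndOfLine    <- "\n\r" / "\n" / "\r"
Preformatted <- "---" EndOfLine .{"---" EndOfLine}
```

The ordered-choice operator makes the two-character "\n\r" match first, so a CRLF is never consumed as a bare CR followed by a stray LF.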
The google page is gone, but its content is not: https://web.archive.org/web/20150418033327/http://code.google.com/p/support/...
Levente
tim
tim Rowledge; tim@rowledge.org; http://www.rowledge.org/tim Strange OpCodes: CM: Circulate Memory
If I may jump in on this conversation.
I have an alpha version of what I call SeasideDoc (I will be uploading to SqueakSource once I get some account stuff resolved)
http://menmachinesmaterials.com/SeasideDoc
The landing page gives the roadmap I have in mind.
In the left hand menu, if you click on Swiki, it expands to all the current Swiki pages which appear to display just fine.
The methodology was that I scraped the wiki content and stored it in classes in the image. The classes are subclasses of the standard Seaside renderable component; I call them 'doclets'.
My apologies if this is irrelevant to what you are trying to accomplish.
<PEGParser class-grammarWiki.st><GoogleWikiCompiler.2.st>
Thank you for the 'fixed' grammar. It even makes sort-of-sense to me, though I couldn't replicate it myself. In my current setup the code/pre/list etc stuff is now working nicely. With a tiny bit of furniture it will make a very good start to a usable wiki.
The only 'TinyWiki' stuff google threw up was an html-only approach - was that the one you were thinking of?
tim -- tim Rowledge; tim@rowledge.org; http://www.rowledge.org/tim Useful random insult:- Has a one-way ticket on the Disoriented Express.
On Aug 14, 2019, at 6:15 PM, tim Rowledge tim@rowledge.org wrote:
The only 'TinyWiki' stuff google threw up was an html-only approach - was that the one you were thinking of?
No. SqueakSource since forever has had a wiki for every project. It’s called TinyWiki. It’s the kernel of Lukas Renggli’s Pier wiki CMS. It requires SmaCC, which Levente says can be loaded in 5.2; I guess you’d change SmalltalkImage>>#associationAt:ifAbsent: to use Smalltalk globals or something. TinyWiki uses the Visitor pattern and is a gem. You can download it here. [1] You’ll need to add this initialize method. [2] Then you can register it as a regular Seaside component. [3]
Chris
Nazareth - This Flight Tonight https://www.youtube.com/watch?v=ylW6sC6NNhY
[1]
Installer ss project: 'SmaccDevelopment'; install: 'SmaCC-lr.14.mcz'; install: 'SmaCCDev-lr.18.mcz'.
Installer ss project: 'ss2'; install: 'TinyWiki'.
[2]
TWWiki>>initialize
	super initialize.
	self setModel: TWFolder new initializeHelp
[3]
TWWiki registerAsApplication: 'TinyWiki'
On 2019-08-14, at 4:12 PM, Chris Cunnington brasspen@gmail.com wrote:
On Aug 14, 2019, at 6:15 PM, tim Rowledge tim@rowledge.org wrote:
The only 'TinyWiki' stuff google threw up was an html-only approach - was that the one you were thinking of?
No. SqueakSource since forever has had a wiki for every project. It’s called TinyWiki. It’s the kernel of Lukas Renggli’s Pier wiki CMS.
OK. An obvious question here is why that just didn't appear in my searching. This has been a bit of a theme recently; looks like we are failing at being noticed...
I'll give it a try.
tim -- tim Rowledge; tim@rowledge.org; http://www.rowledge.org/tim Oxymorons: Legally drunk
I would like to say the first rule of fight club is that you don’t talk about fight club, but I imagine it’s that Seaside creates random urls and so cannot be scraped by Google.
Chris
On Aug 15, 2019, at 2:57 PM, tim Rowledge tim@rowledge.org wrote:
On 2019-08-14, at 4:12 PM, Chris Cunnington brasspen@gmail.com wrote:
On Aug 14, 2019, at 6:15 PM, tim Rowledge tim@rowledge.org wrote:
The only 'TinyWiki' stuff google threw up was an html-only approach - was that the one you were thinking of?
No. SqueakSource since forever has had a wiki for every project. It’s called TinyWiki. It’s the kernel of Lukas Renggli’s Pier wiki CMS.
OK. An obvious question here is why that just didn't appear in my searching. This has been a bit of a theme recently; looks like we are failing at being noticed...
I'll give it a try.
tim
tim Rowledge; tim@rowledge.org; http://www.rowledge.org/tim Oxymorons: Legally drunk
On 2019-08-15, at 12:01 PM, Chris Cunnington brasspen@gmail.com wrote:
I would like to say the first rule of fight club is that you don’t talk about fight club, but I imagine it’s that Seaside creates random urls and so cannot be scraped by Google.
It's not just with google that we fail though, sadly. The search within squeaksource is not so useful either. For example almost the first thing I did when you mentioned it was to look for the 'ss2' project - which didn't get found. Sure 'tinywiki' was found but we must be able to do better at this kind of thing somehow. More thoughtful tagging would help a lot.
One of the real annoyances I have with the 'cool names clique' that has long existed in software project naming-land is that it horribly obscures things. In fact that annoyance was why I labelled the host windows code as 'Ariethfa Ffenestri' - as a tweak to the nose of 'cool names', since it is just a literal translation of the 'proper' project name.
Obviously one thing that would help a lot is us collectively making better use of SqueakMap. Fat chance of that...
tim -- tim Rowledge; tim@rowledge.org; http://www.rowledge.org/tim Any nitwit can understand computers. Many do.
My idea was just to export everything as sources in a comprehensive hierarchy. This could be loaded from sources with minimal tooling (i.e. cuis) via Sake/Packages, where Sake handles the dependencies. This worked reasonably well and could handle the whole Seaside dependency tree (for which they wrote a custom website)
Keith
Loading TinyWiki is actually a smidge simpler than suggested, but there are two methods that need tweaking, and it sure could use some html layout help.
Oh, and it doesn't save anything anywhere except in-image. That may be a good thing for some purposes or it may be a bad thing. I think it would be better saved to files in general, though I may try dumping to a postgresv3 connected database just for grins.
--------------
Installer ss project: 'SmaccDevelopment'; install: 'SmaCC-lr.14.mcz'.
" install: 'SmaCCDev-lr.18.mcz'. <- this doesn't appear to be needed and avoids the problem with SmalltalkImage>>#associationAt:ifAbsent: "
Installer ss project: 'ss2'; install: 'TinyWiki'.
"then edit in these two methods and run the class init"
TWWiki>>initialize
	super initialize.
	self setModel: TWFolder new initializeHelp

TWWiki class>>initialize
	"self initialize"
	WAAdmin register: self asApplicationAt: #tinywiki
-----------------
Start the WAWebServerAdaptor on port 8080, then browse to localhost:8080/tinywiki
The 'action buttons' are in a vertically oriented list, which looks awful; any hints on changing that would be welcome. I did compare the produced html with that for the tinywikis on squeaksource and it looked very similar. In the simple file wiki based on Ramon's blog I used a pair of buttons within a form to get side-by-side buttons. I've no idea which might be the 'nice' way to do it; I have a horrible feeling it might involve CSS.
So that's an outline of one of the two approaches that appear practical...
tim -- tim Rowledge; tim@rowledge.org; http://www.rowledge.org/tim Oxymorons: Clearly misunderstood
On Aug 15, 2019, at 7:33 PM, tim Rowledge tim@rowledge.org wrote:
The 'action buttons' are in a vertical oriented list, which looks awful; any hints on changing that would be welcome; I did compare the produced html with that for the tinywikis on squeaksource and it looked very similar.
The method that renders those links is #renderWikiActionsOn:. It collects an array from #actions and iterates over them. That’s why it's a vertical list.
Chris
On 2019-08-16, at 6:57 AM, Chris Cunnington brasspen@gmail.com wrote:
On Aug 15, 2019, at 7:33 PM, tim Rowledge tim@rowledge.org wrote:
The 'action buttons' are in a vertical oriented list, which looks awful; any hints on changing that would be welcome; I did compare the produced html with that for the tinywikis on squeaksource and it looked very similar.
The method that renders those links is #renderWikiActionsOn:. It collects an array from #actions and iterates over them. That’s why it's a vertical list.
That's what I thought until I looked at the html for the squeaksource wiki page and saw the same construct resulting in a horizontal row. At which point I decided that I was correct in my long held opinion that html has become a gateway to insanity.
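For what it's worth, one way to get the anchors side by side without touching CSS is to render them inline with a separator instead of inside a list. A sketch only: it borrows the #actions name from the thread, and #performAction: is a made-up placeholder for whatever each action's callback actually does.

```smalltalk
renderWikiActionsOn: html
	"Render the wiki actions as a row of anchors separated by spaces
	rather than as list items, which browsers stack vertically by default.
	#performAction: is a hypothetical placeholder."
	self actions
		do: [:each |
			html anchor
				callback: [self performAction: each];
				with: each asString]
		separatedBy: [html space]
```

The squeaksource pages presumably get their horizontal row the other way, via a stylesheet rule that makes the list items display inline.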
tim -- tim Rowledge; tim@rowledge.org; http://www.rowledge.org/tim Don't diddle code to make it faster; find a better algorithm.
On Aug 15, 2019, at 7:33 PM, tim Rowledge tim@rowledge.org wrote:
Oh and it doesn't save anything anywhere except in-image. That may be a good thing for some purposes or it may be a bad thing. I tihnk it would be better saved to files in general, though I may try dumping to a postgresv3 connected database just for grins.
I’ve got a demo image of TinyWiki displaying Swiki page content from a new server. [1] About 10 pages. This image has a tool prototype called SwikiToTinyWiki which opens Swiki XML files in the pages directory and makes objects that the TinyWiki Visitor pattern can use.
Chris
[1]
Squeak 4.4; can be opened with Cog 4.0.3427 or similar (non-Spur). Browse to localhost:8080/tinywiki
https://www.dropbox.com/s/qx2nvlwd66d0zwg/TinyWiki%3ASwiki.zip?dl=0
On 2019-08-16, at 11:57 AM, Chris Cunnington brasspen@gmail.com wrote:
On Aug 15, 2019, at 7:33 PM, tim Rowledge tim@rowledge.org wrote:
Oh and it doesn't save anything anywhere except in-image. That may be a good thing for some purposes or it may be a bad thing. I tihnk it would be better saved to files in general, though I may try dumping to a postgresv3 connected database just for grins.
I’ve got a demo image of TinyWiki displaying Swiki page content from a new server.
Interesting. A lot of stuff in that image!
(Mind you, I did manage to completely lock it up by trying to visit examplebrowser; as best I can tell it was choking on trying to find the sources file. Whilst it's perfectly ok to complain about not finding it, a total lockup is a bit bad. I hope we've improved that bit since 4.notmuch.)
tim -- tim Rowledge; tim@rowledge.org; http://www.rowledge.org/tim May the bugs of many programs nest on your hard drive.
With much helpful explanation from Levente I was able to make just enough sense of the PEG parser grammar and operation to add a horizontal rule rule, and then to factor the heading rule to count the number of $= characters to work out the heading level rather than having level-specific tags. Trying to do the same for lists has, however, completely eluded me. And the table cells simply can't have anything other than plain-ish text in them without making stuff go very odd.
The other problem is that the code is currently really set up to use google's swiki syntax, which appears to be as dead as a very dead thing. It would be nice to rework it to use some other syntax that has at least a bit of current usage. Would using Markdown with (some) github extensions suit us? Is anyone feeling conversant enough with a PEG grammar to tackle that?
tim -- tim Rowledge; tim@rowledge.org; http://www.rowledge.org/tim Useful random insult:- One clown short of a circus.
Folks: I'm happy with my Squeak-only solution for a personal/sharing swiki. But people outside the Smalltalk world use other tools. Some exploration shows React is popular these days, so I searched for an uncomplicated wiki made with it. I was able to make one with https://docusaurus.io/en/ - it uses .md files for serving to the web. How do you convert http://wiki.squeak.org/squeak to lots of .md files? Not so easy. I found https://pandoc.org/ which seems a wonderful tool to have, but unluckily for us it only works with UTF-8 and not ISO-8859-1:
curl --silent http://wiki.squeak.org/squeak/1 | pandoc --from html --to markdown_strict -o 1.md
UTF-8 decoding error in - at byte offset 7466 (92).
The input must be a UTF-8 encoded text.
Some more work, and pandoc -f html -t markdown 1.html -o 1.md made a readable document. It still has some bits to rip out, but it is usable. I'll dedicate the weekend to producing the complete set of .md documents. Feedback?
Edgar @morplenauta
... and the other approach outline -
Loading up the Xtreams, the variant of Ramon's blog-post wikipage, Levente's parser and some other fiddling by me goes like this
"install SimpleFileWiki.st"
Installer ss project: 'MetacelloRepository'; install: 'ConfigurationOfXtreams'. (Smalltalk at: #ConfigurationOfXtreams) project bleedingEdge load.
"install /home/pi/DizietFS/Desktop/GoogleWikiCompiler.st"
and then the same Seaside startup/init as before.
The downside of this approach is the much, much bigger lump of code installed by Xtreams. If, however, you want the (apparently; no personal opinion on this yet) better streaming, and if you have use for the FFI stuff, then this may be a price worth paying. Clearly the PEGParser stuff seems a bit more expansive and perhaps more up to date than the SmaCC package.
It wouldn't take much to enable either swikipage class/package to use the other parser, of course. Both really only interface with a parser in one place.
What would seem like a really good idea to me would be getting a decent variant (by whatever definition one chooses) included in the Seaside package(s), since swiki is a very good demo that has real uses beyond mere demo-ness.
tim -- tim Rowledge; tim@rowledge.org; http://www.rowledge.org/tim Strange OpCodes: RDLI: Rotate Disk Left Immediate
On Thu, 15 Aug 2019, tim Rowledge wrote:
... and the other approach outline -
Loading up the Xtreams, the variant of Ramon's blog-post wikipage, Levente's parser and some other fiddling by me goes like this
"install SimpleFileWiki.st"
Installer ss project: 'MetacelloRepository'; install: 'ConfigurationOfXtreams'. (Smalltalk at: #ConfigurationOfXtreams) project bleedingEdge load.
"install /home/pi/DizietFS/Desktop/GoogleWikiCompiler.st"
and then the same Seaside startup/init as before.
The downside of this approach is the much, much bigger lump of code installed by Xtreams. If, however, you want the (apparently; no personal opinion on this yet) better streaming, and if you have use for the FFI stuff, then this may be a price worth paying. Clearly the PEGParser stuff seems a bit more expansive and perhaps more up to date than the SmaCC package.
Well, you can install only what you need from Xtreams (without Metacello, stuff that depends on FFI, like Xtream-Xtras, or dependencies like XML-Parser):
Installer ss
	project: 'Xtreams';
	addPackage: 'Xtreams-Support';
	addPackage: 'Xtreams-Core';
	addPackage: 'Xtreams-Terminals';
	addPackage: 'Xtreams-TerminalsFileDirectory';
	addPackage: 'Xtreams-Transforms';
	addPackage: 'Xtreams-Substreams';
	addPackage: 'Xtreams-Parsing';
	install.
Levente
It wouldn't take much to enable either swikipage class/package to use the other parser, of course. Both really only interface with a parser in one place.
What would seem like a really good idea to me would be getting a decent variant (by whatever definition one chooses) included in the Seaside package(s), since swiki is a very good demo that has real uses beyond mere demo-ness.
tim
tim Rowledge; tim@rowledge.org; http://www.rowledge.org/tim Strange OpCodes: RDLI: Rotate Disk Left Immediate
On Wed, 14 Aug 2019, tim Rowledge wrote:
<PEGParser class-grammarWiki.st><GoogleWikiCompiler.2.st>
Thank you for the 'fixed' grammar. It even makes sort-of-sense to me, though I couldn't replicate it myself. In my current setup the code/pre/list etc stuff is now working nicely. With a tiny bit of furniture it will make a very good start to a usable wiki.
The only 'TinyWiki' stuff google threw up was an html-only approach - was that the one you were thinking of?
TinyWiki is the wiki used by SqueakSource to provide wikis for the projects. So, I suggest searching for TinyWiki on squeaksource.com.
Levente
tim
tim Rowledge; tim@rowledge.org; http://www.rowledge.org/tim Useful random insult:- Has a one-way ticket on the Disoriented Express.
On Mon, 5 Aug 2019, Chris Cunnington wrote:
On Aug 5, 2019, at 10:37 PM, Levente Uzonyi leves@caesar.elte.hu wrote:
On Mon, 5 Aug 2019, tim Rowledge wrote:
Well, after trying with no luck to look into the sources delivered from the code.google page, I found the SqueakSource repository version. That seems to have loaded ok but references a class XMLString that I have had no success in finding and which is of course Undeclared. Anyone know where it can be found?
That's why Xtreams should be loaded via Metacello, so that all dependencies are loaded along with it. Metacello will load monty's version of XML-Parser, which contains the XMLString class (which is still not backwards compatible, so if you need XML for anything else, then you'll have to write a new actor to generate html, but the parser will still be fine as-is). Here's how to do that:
- Install Metacello from the Docking Bar's Tools menu's Metacello entry
At the risk of being a petty, corrector kind of guy, and because I just tried this out: it’s the Do menu and Installer ensureRecentMetacello. I think I recall a somewhat long debate about adding this. It’s clearly quite valuable.
It was there a few months ago, but since then it's been moved to Tools.
Evaluate this first:
Installer ss project: 'MetacelloRepository'; install: 'ConfigurationOfXtreams’.
Right, that step was missing.
And then this:
- Evaluate (Smalltalk at: #ConfigurationOfXtreams) project bleedingEdge load
Or it will complain it has no knowledge of a ConfigurationOfXtreams.
I’m pleased to see that PEGWikiGenerator. Sean did a great thing there, because it’s a nice concrete example without having to fish around in VW. There are two other actor subclasses but they are sort of recursive and trippy. Parsers making parsers making parsers and so on.
It's indeed a nice example. Too bad it needs external packages.
Levente
Chris
Anthrax - Got The Time https://www.youtube.com/watch?v=be7iNHw8QoQ
Levente
P.S.: The actual problem is that people had started to make projects Pharo compatible on squeaksource, often by breaking Squeak compatibility (e.g. SmaCC, and partially Xtreams), and later moved on to smalltalkhub/github, leaving the broken (from both Squeak's and Pharo's POV) code rot on squeaksource.
On 2019-08-04, at 7:48 PM, Levente Uzonyi leves@caesar.elte.hu wrote:

Actually, the SmaCC runtime (SmaCC package) still loads. It's just that the dev tools package (SmaCCDev) needs some changes to be up-to-date. So, if you have an existing SmaCC-based parser, it probably still works. The Xtreams-Parsing package has PEGWikiGenerator, which converts some wiki syntax to xhtml. It uses monty's XML parser, which is a highly extended version of XML-Parser, but it's not backwards compatible with Squeak's version. It loads cleanly into the image, and its 5000+ tests all pass. If you don't use the XML-Parser package, loading Xtreams with Metacello will cause no problems for you. Another option, as you wrote, is to create a Squeak-compatible PEGActor to generate the html. It should be fairly easy using the PEGWikiGenerator class.

Levente

On Sun, 4 Aug 2019, Chris Cunnington wrote:
SmaCC doesn’t load into Squeak anymore. As Levente’s post a few back said, the parser to have is in Xtreams. It has a PEG parser with a wiki grammar. You need to write an Actor subclass for it. [1] The VW package comes with Actor subclass examples (i.e. PEG.WikiGenerator).

Chris

[1] https://code.google.com/archive/p/xtreams/wikis/Parsing.wiki
On Aug 4, 2019, at 8:33 PM, tim Rowledge <tim@rowledge.org> wrote:

On 2019-08-04, at 5:30 PM, tim Rowledge <tim@rowledge.org> wrote:

So far I've found a lot of big projects that need huge looking piles of infrastructure and mostly Pharo. I'd like something a bit lighter.
[snip]

Oh, foo; forgot to mention having found an old pointer in ss3 (from 5 years ago!) -

(Installer ss3) project: 'SqueakServices'; install: 'Swiki'.

... but it isn't there any more and I couldn't find anything that looked like it.

tim -- tim Rowledge; tim@rowledge.org; http://www.rowledge.org/tim Useful random insult:- If he were any more stupid, he'd have to be watered twice a week.
tim
tim Rowledge; tim@rowledge.org; http://www.rowledge.org/tim Do files get embarrassed when they get unzipped?
On 2019-08-06, at 5:34 AM, Levente Uzonyi leves@caesar.elte.hu wrote:
On Mon, 5 Aug 2019, Chris Cunnington wrote:
Here's how to do that:
- Install Metacello from the Docking Bar's Tools menu's Metacello entry
At the risk of being a petty, corrector kind of guy, and because I just tried this out: it’s the Do menu and Installer ensureRecentMetacello. I think I recall a somewhat long debate about adding this. It’s clearly quite valuable.
It was there a few months ago, but since then it's been moved to Tools.
In my 5.3-18694 image it actually gets listed in the Tools menu as just 'Metacello'. Now, if it were the case that the first time you used it the metacello stuff were installed and thereafter some tool opened, that would be ok. But that isn't what seems to happen; choose that option again and you get to install metacello again. That seems a poor choice to have in a menu that is basically for opening tools.
- Evaluate (Smalltalk at: #ConfigurationOfXtreams) project bleedingEdge load
Or it will complain it has no knowledge of a ConfigurationOfXtreams. I’m pleased to see that PEGWikiGenerator. Sean did a great thing there, because it’s a nice concrete example without having to fish around in VW. There are two other actor subclasses but they are sort of recursive and trippy. Parsers making parsers making parsers and so on.
It's indeed a nice example. Too bad it needs external packages.
Yeah. Especially one that doesn't load properly from SM despite that being the only load-path I was able to google. There has to be a better way of finding stuff, surely.
tim -- tim Rowledge; tim@rowledge.org; http://www.rowledge.org/tim Useful random insult:- Ignorant, and proud of it.
Just for future reference - Edgar was referring to www.squeakros.org, not www.squearos.org ....
On 2019-07-18, at 3:01 AM, Edgar J. De Cleene edgardec2005@gmail.com wrote:
The short aswer is Chis modified old code for swiki and now runs in newer as 3.6. The question is: What you wish ? If you wish run your own swiki I could made a 5.2 64 bit version of squearos.org , which is my living playground where I daily synch with real swiki and and all my stuff. Is VERY customizable with external html5,js,css and now with wasm.
User of squearos.org is visita without pass
The box wich serves squeakros.org is a old Pentium IV with Ubuntu Mate 16 some shrinked.
Edgar @morplenauta
tim -- tim Rowledge; tim@rowledge.org; http://www.rowledge.org/tim Useful random insult:- Hypnotized as a child and couldn't be woken.