On 10/23/10 7:44 AM, Hartmut Kaiser wrote:
On Friday 22 October 2010 11:29:07 Joel de Guzman wrote:
On 10/22/10 4:17 PM, Thomas Heller wrote:
On Friday 22 October 2010 09:58:25 Eric Niebler wrote:
On 10/22/2010 12:33 AM, Thomas Heller wrote:
On Friday 22 October 2010 09:15:47 Eric Niebler wrote:
On 10/21/2010 7:09 PM, Joel de Guzman wrote:
Check out the doc I sent (Annex A). It's really, to my mind, about
generic languages -- abstraction of rules and templated grammars
through metanotions and hyper-rules.

Parameterized rules. Yes, I can understand that much. My
understanding stops when I try to imagine how to build a parser
that recognizes a grammar with parameterized rules.

And I can't understand how expression templates relate to parsing.

It doesn't in any practical sense, really. No parsing ever happens
in Proto. The C++ compiler parses expressions for us and builds the
tree.
Proto grammars are patterns that match trees. (It is in this sense that
they're closer to schemata than to grammars that drive parsers.)

They're called "grammars" in Proto not because they drive the
parsing but because they describe the valid syntax for your embedded
language.
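
For concreteness, a minimal sketch of what "patterns that match trees"
means in code (the names plus_ints and i are made up for illustration,
not taken from Proto's documentation or from any attachment):

#include <boost/proto/proto.hpp>
namespace proto = boost::proto;

// A Proto "grammar": a pattern describing plus nodes whose children
// are integer terminals. It never parses anything.
struct plus_ints
  : proto::plus< proto::terminal<int>, proto::terminal<int> >
{};

int main()
{
    proto::terminal<int>::type const i = {42};

    // The C++ compiler has already parsed i + i and built the tree;
    // proto::matches<> only checks that tree against the pattern.
    static_assert(proto::matches<decltype(i + i), plus_ints>::value,
                  "the expression tree conforms to the grammar");
}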

Ok, this formulation makes it much clearer :)

It's just the metaphor! And what I'm saying is that you will get into
confusion land if you mix metaphors from different domains. Proto uses
the parsing domain and it makes sense (*). It may (and I say may) be
possible to extend that metaphor, and in the end it may be possible to
incorporate that into proto instead of phoenix (if it is indeed
conceptually understandable and reusable) -- an opportunity that may be
missed if you shut the door and dismiss the idea prematurely.

It is OK to switch metaphors and have a clean cut. But again, my point
is: use only one metaphor. Don't mix and match ad-hoc.

(* regardless of the fact that it doesn't do any parsing at all!)

I have this strong feeling that that's the intent of Thomas and
your recent designs. Essentially, making the phoenix language a
metanotion in itself that can be extended post-hoc through
generic means.

I don't think that's what Thomas and I are doing. vW-grammars
change the descriptive power of grammars. But we don't need more
descriptive grammars. Thomas and I aren't changing the grammar of
Phoenix at all. We're just plugging in different actions. The
grammar is unchanged.

Exactly.
Though I think this is the hard part to wrap your head around. We
have a grammar, and this very same grammar is used to describe
"visitation".

It's for the same reason that grammars are useful for validating
expressions that they are also useful for driving tree traversals:
pattern matching. There's no law that the /same/ grammar be used
for validation and evaluation. In fact, that's often not the case.

True.
However, it seems convenient to me to reuse the grammar you wrote for
validating your language to drive the traversal of an expression that
matches that grammar.
This is what we tried with this rule-based dispatching to Semantic
Actions. I am currently thinking in another direction, that is,
separating traversal and grammar again, very much like proto
contexts, but with this rule dispatching, and describing it with
proto transforms ... the idea is slowly materializing in my head ...

Again I should warn against mixing metaphors. IMO, that is the basic
reason why this is so deceptively unclear. There's no clear model that
conceptualizes all this, and thus no way to reason about it on an
abstract level. Not good.

Alright, I think mixing metaphors is indeed a very bad idea.
IMHO, it is best to stay in the grammar-with-semantic-actions domain, as
it always(?) has been.
I racked my brain today and developed a solution which stays within these
very same proto semantics and reuses "keywords" (more on that) already in
proto.
Attached, you will find the implementation of this very idea.

So, semantic actions are kind of simple right now. They work by having
proto::when<some_grammar_rule, some_transform>. This ultimately binds
that specific transform to that specific grammar rule.
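
For reference, here is what that currently looks like in a tiny, made-up
calculator grammar, with the transform fused into each rule via
proto::when (standard Proto usage, not code from the attachment):

#include <boost/proto/proto.hpp>
namespace proto = boost::proto;

// Grammar and actions welded together: each alternative carries the
// transform that evaluates it.
struct calc
  : proto::or_<
        proto::when< proto::terminal<int>, proto::_value >
      , proto::when< proto::plus<calc, calc>, proto::_default<calc> >
    >
{};

int main()
{
    proto::terminal<int>::type const i = {1};
    auto sum = i + i;
    int two = calc()(sum);  // evaluation is driven by the fused transforms
    (void)two;
}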

The solution attached follows the same principles. However, grammars and
transforms can now be decoupled. Transforms are looked up by the rules
that define the grammar. In order to transform an expression with this new
form of semantic actions, a special type of transform has to be used,
which I call "traverse". This transform is parameterized by the grammar,
which holds the rules, and by the Actions, which hold the actions.
The action lookup is done by the following rules (where the code differs
from the description, that is considered a bug in the code); a
much-simplified sketch follows the list:
    1) Peel off grammar constructs like or_, and_ and switch_.
    2) Look at the innermost layer of the grammar and see whether the
supplied actions implement an action for that rule. If the rule also
matches the current expression, that action is returned.
    3) If no action was found, just return the expression itself.
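
To illustrate the lookup rules only, here is a much-simplified sketch.
It is not the attached code: the names my_actions, int_terminal and
plus_rule are made up, the second template parameter of when is a dummy
rather than the Actions type, and the peeling of or_/and_/switch_ from
rule 1 is omitted entirely.

#include <boost/proto/proto.hpp>
namespace proto = boost::proto;

// Two named grammar rules.
struct int_terminal : proto::terminal<int> {};
struct plus_rule    : proto::plus<proto::_, proto::_> {};

// An "actions" bundle: transforms keyed by rule. The unspecialized
// primary is the fallback of rule 3 and hands the expression back.
struct my_actions
{
    template<typename Rule, typename Dummy = void>
    struct when : proto::_expr {};

    // An action registered for the int_terminal rule: extract the value.
    template<typename Dummy>
    struct when<int_terminal, Dummy> : proto::_value {};
};

int main()
{
    proto::terminal<int>::type const i = {7};
    auto plus_expr = i + i;

    // Rule 2: an action exists for int_terminal and i matches the rule,
    // so the looked-up transform runs and yields 7.
    int v = my_actions::when<int_terminal>()(i);

    // Rule 3: nothing is registered for plus_rule, so the fallback
    // transform returns the expression itself, untouched.
    auto same = my_actions::when<plus_rule>()(plus_expr);
    (void)v; (void)same;
}

The point is merely that the transform is found through the rule name,
with the identity fallback of rule 3.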

The handling of these rules is completely implemented in the
_traverse<Grammar, Actions>  transform. This leaves the grammar what it
is: a grammar, making it possible to reuse it with different sets of
transforms. Such a set of transforms is called Actions.

An Actions class has a nested struct when<Rule, Actions>. This resembles
the behavior of the already existing proto::when.

Example (from above):

struct some_actions
  : actions<some_actions>
{
    template<typename Actions>
    struct when<some_grammar_rule, Actions>
      : some_transform
    {};
};

traverse<some_grammar, some_actions>(expr);

Here, expr is an expression tree conforming to some_grammar.

I think this is the simplification of client-side proto code we were
searching for. It probably needs some minor polishing, though.

Thoughts?

Excellent thinking! I'm amazed!

This is the best extension interface I've seen so far. If we were
to have Spirit3, I'd like to use this extension interface.

Regards,
--
Joel de Guzman
http://www.boostpro.com
http://spirit.sf.net


