50 votes

Running Bison on this file:

%{
    #include <iostream>
    #include <map>
    #include <string>
    int yylex();
    void yyerror(const char*);
    std::map<std::string, int> vars;   // symbol table used by the actions below
%}


%union
{
    char    name[100];
    int     val;
}

%token NUM ID
%right '='
%left '+' '-'
%left '*'

%%

exp :   NUM     {$$.val = $1.val;}
    | ID        {$$.val = vars[$1.name];}
    | exp '+' exp   {$$.val = $1.val + $3.val;}
    | ID '=' exp    {$$.val = vars[$1.name] = $3.val;}
;

%%

This produces warnings like:

warning: $$ of 'exp' has no declared type.

What does it mean and how do I solve it?

+1: for appearing first when googling "bison error has no declared type". – INS

Just a small clarification: I have %union { int intValue; float floatValue; } but it does not allow me to use $$.intValue or $1.intValue. It says error: request for member 'floatValue' in something not a structure or union. Why is that? – Shashwat

2 Answers

48 votes

The union you define with %union is not meant to be accessed directly in your actions. Instead, you tell Bison which member of the union each token and nonterminal uses, and Bison substitutes that member for you wherever you write $$, $1, and so on.

This is done with the %type directive.

A fixed version of the code is:

%{
    #include <iostream>
    #include <map>
    #include <string>
    int yylex();
    void yyerror(const char*);
    std::map<std::string, int> vars;   // symbol table used by the actions below
%}


%union
{
    char    name[100];
    int     val;
}

%token NUM ID
%right '='
%left '+' '-'
%left '*'

%type<val> exp NUM
%type<name> ID

%%

exp :   NUM     {$$ = $1;}
    | ID        {$$ = vars[$1];}
    | exp '+' exp   {$$ = $1 + $3;}
    | ID '=' exp    {$$ = vars[$1] = $3;}
;

%%
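
To make the fixed grammar actually runnable, the lexer has to fill in the very union member that was declared for each token before returning it. Here is a minimal hand-written yylex sketch for the epilogue (my own illustration, not part of the original answer; a real project would typically generate the lexer with Flex):

#include <ctype.h>
#include <stdio.h>

int yylex()
{
    int c = getchar();
    while (c == ' ' || c == '\t' || c == '\n')   /* skip whitespace */
        c = getchar();

    if (isdigit(c)) {                            /* NUM: fill yylval.val */
        yylval.val = 0;
        do {
            yylval.val = yylval.val * 10 + (c - '0');
            c = getchar();
        } while (isdigit(c));
        ungetc(c, stdin);
        return NUM;
    }

    if (isalpha(c)) {                            /* ID: fill yylval.name */
        int i = 0;
        do {
            if (i < 99)
                yylval.name[i++] = c;
            c = getchar();
        } while (isalnum(c));
        yylval.name[i] = '\0';
        ungetc(c, stdin);
        return ID;
    }

    if (c == EOF)
        return 0;                                /* tell Bison: end of input */
    return c;                                    /* '+', '=', ... pass through as themselves */
}
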
8 votes

As a further thought: if you want to be more explicit in your reductions (handy if you are doing AST annotation), you can make your stack values pointers and handle the type tags yourself, with a tagged scalar type much like this:

struct myScalar {
    union {                     /* the value itself; one member per kind */
        int   num;
        char *id;
        char *float_lexeme;
    } payload;

    enum {                      /* tag recording which payload member is live */
        TYPE_NUM,
        TYPE_IDENTIFIER,
        TYPE_FLOAT_CHAR
    } type;

    char *orig_lexeme;          /* the original source text of the token */
};

Then typedef it and put a scalar_val * on the parser stack as the semantic value type.
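
Concretely, the wiring in the prologue could look like this (a sketch of one common way to do it; the name scalar_val just matches the answer's wording):

%{
/* Assumes the struct myScalar definition above is visible here,
   e.g. via a shared header. */
typedef struct myScalar scalar_val;

/* With YYSTYPE a pointer, every $$ and $n on the parser stack is a
   scalar_val *, and no %union is needed at all. */
#define YYSTYPE scalar_val *
%}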

When you move on to more complex compiler front-ends, it can help to build your AST like this: you carry better metadata when you traverse the tree, and you can augment the translation with pre-semantic type information. It then boils down to your leaf productions, such as ID, shuffling the lexeme into the right scalar payload, as sketched below.
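
For instance, a leaf production could look like this (my sketch; it assumes the lexer has already allocated the node and saved the raw token text in orig_lexeme before handing it over as yylval):

exp : ID    {
                /* Tag the node and move the lexeme into the matching
                   payload member; struct and enum are as defined above. */
                $1->type = TYPE_IDENTIFIER;
                $1->payload.id = $1->orig_lexeme;
                $$ = $1;
            }
;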

Not a complete explanation, but you get the idea.

Hope this helps with your future Bison/Lex front-ends and ...

Good Luck