I have an existing fact table and I need to model an OLAP cube on it using a Mondrian schema.
The fact table has several columns holding the primary keys of dimension tables, but unfortunately it also contains rows with the dimension value "0", used by convention to mean "all".
An example with only one dimension, student_id:
| student_id | exams |
+------------+-------+
|          0 |    23 |
|      20131 |    15 |
|      20168 |     2 |
|      20468 |     6 |
The student table:
   id  | name
-------+-----------
 20131 | John
 20168 | Daid
 20468 | Peter
 20629 | Paul
 22344 | Micheal
My schema is:
<Schema name="students">
  <Dimension type="StandardDimension" visible="true" name="StudentDimension">
    <Hierarchy name="Student" visible="true" hasAll="true" primaryKey="id">
      <Table name="student" schema="public" alias="" />
      <Level name="Nome" visible="true" column="name" type="String" uniqueMembers="false" levelType="Regular" />
    </Hierarchy>
  </Dimension>
  <Cube name="studentCube" visible="true" cache="true" enabled="true">
    <Table name="students_cube" schema="public" />
    <DimensionUsage source="StudentDimension" name="StudentUsage" visible="true" foreignKey="student_id" />
    <Measure name="Exams" column="exams" datatype="Numeric" aggregator="sum" visible="true" />
  </Cube>
</Schema>
My problem is that when I execute the query:
SELECT
{[Measures].[Exams]} ON COLUMNS,
{[StudentUsage.Student].[All StudentUsage.Student]} ON ROWS
FROM [studentCube]
I get the result "46": the sum of all the exams, including the row with student_id = 0.
I'd like to exclude, in the schema itself, the measure values associated with dimension key "0". Is that possible?
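For reference, one workaround I've considered is filtering the fact table at the schema level. This is only a sketch, assuming Mondrian's `<View>` element (which lets a cube read its fact table from inline SQL instead of a physical table) is available in my Mondrian version; the alias and SQL are hypothetical:

```xml
<Cube name="studentCube" visible="true" cache="true" enabled="true">
  <!-- Hypothetical: replace <Table> with a <View> that filters out the "all" rows -->
  <View alias="students_cube_filtered">
    <SQL dialect="generic">
      SELECT student_id, exams
      FROM public.students_cube
      WHERE student_id &lt;&gt; 0
    </SQL>
  </View>
  <DimensionUsage source="StudentDimension" name="StudentUsage" visible="true" foreignKey="student_id" />
  <Measure name="Exams" column="exams" datatype="Numeric" aggregator="sum" visible="true" />
</Cube>
```

With this the [All] member should sum only the rows with a real student_id, but I'd prefer a solution that doesn't duplicate the fact table definition in SQL, if one exists.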