In Dijkstra's algorithm, when you visit a node you know that there is no shorter path to that node. This no longer holds when edge weights between 0 and 1 are multiplied along a path: every extra edge you traverse makes the product smaller.
Basically this is equivalent to finding the longest path in a graph. This can also be seen using your idea of taking logarithms, since the logarithm of a number between 0 and 1 is negative. If you take the absolute values of the logarithms of the weights, the longest path under those weights corresponds to the "shortest" path in the multiplicative graph.
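As a quick numeric check of that equivalence (a throwaway Python snippet; the weights are made up):

```python
import math

weights = [0.5, 0.8, 0.9]                        # multiplicative edge weights, all in (0, 1)
product = math.prod(weights)                     # 0.36 -- the quantity we want to minimise
additive = [abs(math.log(w)) for w in weights]   # positive additive costs

# Minimising the product is the same as maximising the sum of |log w|,
# i.e. finding the longest path under the transformed weights.
assert math.isclose(sum(additive), abs(math.log(product)))
```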
If your graph is acyclic, there is a straightforward algorithm (modified from the Longest path problem).
- Find a Topological ordering of the DAG.
- For each vertex you need to store the cost of the best path found so far. Initialize the start vertex's cost to one (and, for example, every other vertex's cost to infinity, so it only gets updated via edges from vertices already reached).
- Travel through the DAG in topological order, starting from your start vertex. At each vertex, check all of its children and update a child's cost if the new cost (the current vertex's cost times the edge weight) is smaller than the stored one. Also store the predecessor, i.e. the vertex from which you reach each child with the lowest cost.
After you reach your final vertex, you can recover the "shortest" path by walking back from the end vertex through the stored predecessors; a sketch of the whole procedure is given below.
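Here is a rough Python sketch of this DAG approach (the edge-list format, vertex names, and the helper name `min_product_path` are just illustrative, not part of any library):

```python
from math import inf
from collections import defaultdict

def min_product_path(edges, start, end):
    """Path from start to end with the smallest product of edge weights
    (each in (0, 1]), on a DAG given as (u, v, weight) tuples."""
    # Build adjacency lists and in-degree counts for a topological sort.
    adj, indeg, nodes = defaultdict(list), defaultdict(int), set()
    for u, v, w in edges:
        adj[u].append((v, w))
        indeg[v] += 1
        nodes.update((u, v))

    # Kahn's algorithm: repeatedly remove vertices with no incoming edges.
    order, queue = [], [n for n in nodes if indeg[n] == 0]
    while queue:
        u = queue.pop()
        order.append(u)
        for v, _ in adj[u]:
            indeg[v] -= 1
            if indeg[v] == 0:
                queue.append(v)

    # cost[v] = smallest product found so far; pred[v] = stored predecessor.
    cost = {n: inf for n in nodes}
    pred = {n: None for n in nodes}
    cost[start] = 1.0
    for u in order:                      # process vertices in topological order
        if cost[u] == inf:               # not (yet) reachable from start
            continue
        for v, w in adj[u]:
            if cost[u] * w < cost[v]:    # smaller product found: update child
                cost[v] = cost[u] * w
                pred[v] = u

    # Walk the stored predecessors back from the end vertex to recover the path.
    if cost[end] == inf:
        return None, inf
    path, v = [], end
    while v is not None:
        path.append(v)
        v = pred[v]
    return path[::-1], cost[end]
```

For example, `min_product_path([("a", "b", 0.5), ("b", "c", 0.5), ("a", "c", 0.4)], "a", "c")` returns `(['a', 'b', 'c'], 0.25)`: the two-edge path wins because its product is smaller than the direct edge's 0.4.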
Of course, if your graph is not acyclic you can drive the cost arbitrarily close to zero by repeating a cycle over and over, so no "shortest" path exists.