
I'm trying to use the %timeit magic, but it gives me a result that I don't want.

The code I'm running in an IPython instance is:

import numpy as np
a = np.random.random((1024,1024))
b = np.random.random((1024,1024))
%timeit a+b

The output is:

The slowest run took 4.09 times longer than the fastest. This could mean that an intermediate result is being cached.
100 loops, best of 3: 1.93 ms per loop

Normally, on other computers I obtain the average time and the standard deviation. I'm interested in both of these results, but I cannot figure out how to get them here. I tried looking around, but everyone seems to use the timeit library rather than the magic. I'd like to know how to change the default behavior of this magic.
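What I'm effectively after is something like the following sketch using the plain timeit library, where I aggregate the repeats myself (the repeat and number values here are arbitrary choices, not anything prescribed by %timeit):

import timeit
import numpy as np

setup = "import numpy as np; a = np.random.random((1024, 1024)); b = np.random.random((1024, 1024))"

# each entry of raw is the total time for `number` executions of a + b
raw = timeit.repeat("a + b", setup=setup, repeat=7, number=100)
per_loop = np.array(raw) / 100  # seconds per single a + b

print("mean %.3g s, std dev %.3g s" % (per_loop.mean(), per_loop.std()))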


1 Answer


I figured out that the reason I got different %timeit output on different machines is the version of Python (and hence of IPython) I was using. On Python 2.7 it returns the "best of 3" output I was complaining about; on Python 3 it returns the average and the standard deviation.
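If you are stuck on the older version, one workaround is to capture the result of the magic with its -o option and compute the statistics yourself. A minimal sketch, assuming an IPython recent enough to support -o on %timeit:

import numpy as np

res = %timeit -o a + b
# all_runs holds the total time of each repeat; divide by loops to get per-call times
per_loop = np.array(res.all_runs) / res.loops

print("mean %.3g s, std dev %.3g s" % (per_loop.mean(), per_loop.std()))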

I'm answering my own question in case someone else is wondering about something similar.