1 vote

I get the following runtime error while performing operations like add() and combine_first() on large dataframes:

ValueError: operands could not be broadcast together with shapes (680,) (10411,)

Broadcasting errors seem to happen quite often with NumPy (matrix dimension mismatches), but I do not understand why they affect my MultiIndex dataframes / series. Each of the elements passed to concat produces this runtime error.

My code:

# I want to merge two dataframes data1 and data2
# add up the 'requests' column
# merge 'begin' column choosing data1-entries first on collision
# merge 'end' column choosing data2-entries first on collision

pd.concat([
    data1["begin"].combine_first(data2["begin"]),
    data2["end"].combine_first(data1["end"]),
    data1["requests"].add(data2["requests"], fill_value=0)
    ], axis=1)

My data:

# data1
                           requests               begin                 end
IP              sessionID
*1.*16.*01.5*   20                9 2011-12-16 13:06:23 2011-12-16 16:50:57
                21                3 2011-12-17 11:46:26 2011-12-17 11:46:29
                22               15 2011-12-19 10:10:14 2011-12-19 16:10:47
                23                9 2011-12-20 09:11:23 2011-12-20 13:01:12
                24                9 2011-12-21 00:15:22 2011-12-21 02:50:22
...
6*.8*.20*.14*   6283              1 2011-12-25 01:35:25 2011-12-25 01:35:25
20*.11*.3.10*   6284              1 2011-12-25 01:47:45 2011-12-25 01:47:45

[680 rows x 3 columns]

# data2
                           requests               begin                 end
IP              sessionID                                                  
*8.24*.135.24*  9215              1 2011-12-29 03:14:10 2011-12-29 03:14:10
*09.2**.22*.4*  9216              1 2011-12-29 03:14:38 2011-12-29 03:14:38
*21.14*.2**.22* 9217             12 2011-12-29 03:16:06 2011-12-29 03:19:45 
...
19*.8*.2**.1*1  62728             2 2012-03-31 11:08:47 2012-03-31 11:08:47
6*.16*.10*.155  77282             1 2012-03-31 11:19:33 2012-03-31 11:19:33
17*.3*.18*.6*   77305             1 2012-03-31 11:55:52 2012-03-31 11:55:52
6*.6*.2*.20*    77308             1 2012-03-31 11:59:05 2012-03-31 11:59:05

[10411 rows x 3 columns] 
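For context, the intended merge semantics on tiny made-up frames (the IPs, session IDs, and values below are invented for illustration):

```python
import pandas as pd

idx1 = pd.MultiIndex.from_tuples([("ip1", 1), ("ip1", 2)], names=["IP", "sessionID"])
idx2 = pd.MultiIndex.from_tuples([("ip1", 2), ("ip2", 9)], names=["IP", "sessionID"])

data1 = pd.DataFrame({"requests": [9, 3],
                      "begin": pd.to_datetime(["2011-12-16 13:06", "2011-12-17 11:46"]),
                      "end": pd.to_datetime(["2011-12-16 16:50", "2011-12-17 11:47"])},
                     index=idx1)
data2 = pd.DataFrame({"requests": [5, 1],
                      "begin": pd.to_datetime(["2011-12-17 11:40", "2011-12-29 03:14"]),
                      "end": pd.to_datetime(["2011-12-17 11:50", "2011-12-29 03:15"])},
                     index=idx2)

merged = pd.concat([
    data1["begin"].combine_first(data2["begin"]),   # data1 wins on collision
    data2["end"].combine_first(data1["end"]),       # data2 wins on collision
    data1["requests"].add(data2["requests"], fill_value=0),
], axis=1)
```

The session ("ip1", 2) exists in both frames, so the merged row keeps data1's begin, data2's end, and the summed request count.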

3 Answers

2 votes

I don't know why (maybe it is a bug), but explicitly selecting all rows of each series with [:] works as expected. No errors.

print(pd.concat([
    data1["begin"][:].combine_first(data2["begin"][:]),
    data2["end"][:].combine_first(data1["end"][:]),
    data1["requests"][:].add(data2["requests"][:], fill_value=0)
    ], axis=1))
0 votes

It looks like when you do data1["requests"].add(data2["requests"], fill_value=0) you are trying to sum two pandas Series with different numbers of rows. Series.add broadcasts the addition across all elements of both series, and this implies they must have the same dimensions.
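For what it's worth, in recent pandas versions Series.add aligns the operands on their index labels rather than by raw position, so differing lengths are handled by taking the union of labels. A minimal sketch with a flat index:

```python
import pandas as pd

s1 = pd.Series([9, 3, 15], index=["a", "b", "c"])
s2 = pd.Series([12, 2], index=["b", "d"])

# Labels are aligned, not positions; fill_value=0 substitutes 0
# wherever a label exists on only one side
total = s1.add(s2, fill_value=0)
```

Only label "b" exists on both sides and gets a true sum; the rest keep their single value.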

0 votes

Using numpy.concatenate((df['col1'], df['col2']), axis=None) works.
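A minimal sketch of that call (column names invented for illustration). Note that axis=None flattens both inputs into a single 1-D array and drops the index entirely, which is different from the index-aligned pd.concat in the question:

```python
import numpy as np
import pandas as pd

df = pd.DataFrame({"col1": [1, 2], "col2": [3, 4]})

# axis=None flattens all inputs into one 1-D array (NumPy >= 1.13)
flat = np.concatenate((df["col1"], df["col2"]), axis=None)
```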