502
votes

I have constructed a condition that extracts exactly one row from my data frame:

d2 = df[(df['l_ext']==l_ext) & (df['item']==item) & (df['wn']==wn) & (df['wd']==1)]

Now I would like to take a value from a particular column:

val = d2['col_name']

But as a result I get a data frame that contains one row and one column (i.e. one cell). It is not what I need. I need one value (one float number). How can I do it in pandas?

13
If you tried some of these answers but ended up with a SettingWithCopyWarning, you can take a look at this post for an explanation of the warning and possible workarounds/solutions. - cs95
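
For context, a minimal sketch of the chained-indexing pattern that typically triggers that warning and the usual fix (column names taken from the question; the 9.9 value is just a placeholder):

d2 = df[df['wd'] == 1]
d2['col_name'] = 9.9   # assigning to the filtered slice may emit SettingWithCopyWarning

# to modify the original frame, assign through a single .loc call instead:
df.loc[df['wd'] == 1, 'col_name'] = 9.9
# to work with an independent sub-frame, take an explicit copy:
d2 = df[df['wd'] == 1].copy()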

13 Answers

611
votes

If you have a DataFrame with only one row, then access the first (only) row as a Series using iloc, and then the value using the column name:

In [3]: sub_df
Out[3]:
          A         B
2 -0.133653 -0.030854

In [4]: sub_df.iloc[0]
Out[4]:
A   -0.133653
B   -0.030854
Name: 2, dtype: float64

In [5]: sub_df.iloc[0]['A']
Out[5]: -0.13365288513107493
281
votes

These provide fast access to scalars:

In [14]: import numpy, pandas

In [15]: df = pandas.DataFrame(numpy.random.randn(5, 3), columns=list('ABC'))

In [16]: df
Out[16]: 
          A         B         C
0 -0.074172 -0.090626  0.038272
1 -0.128545  0.762088 -0.714816
2  0.201498 -0.734963  0.558397
3  1.563307 -1.186415  0.848246
4  0.205171  0.962514  0.037709

In [17]: df.iat[0,0]
Out[17]: -0.074171888537611502

In [18]: df.at[0,'A']
Out[18]: -0.074171888537611502
166
votes

You can turn your 1x1 dataframe into a numpy array, then access the first and only value of that array:

val = d2['col_name'].values[0]
34
votes

Most answers use iloc, which is good for selection by position.

If you need selection by label, loc would be more convenient.

For getting a value explicitly (equivalent to the deprecated df.get_value('a', 'A')):

# this is also equivalent to df1.at['a','A']
In [55]: df1.loc['a', 'A'] 
Out[55]: 0.13200317033032932
26
votes

I needed the value of one cell, selected by column and index names. This solution worked for me:

original_conversion_frequency.loc[1,:].values[0]

22
votes

It doesn't need to be complicated:

val = df.loc[df.wd==1, 'col_name'].values[0]
20
votes

It looks like the behaviour changed between pandas 0.10.1 and 0.13.1.

I upgraded from 0.10.1 to 0.13.1; before that, iloc was not available.

Now with 0.13.1, iloc[0]['label'] gives a one-element Series rather than a scalar.

Like this:

lastprice=stock.iloc[-1]['Close']

Output:

date
2014-02-26    118.2
Name: Close, dtype: float64
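
If a chained lookup like this hands back a one-element Series instead of a scalar, a minimal sketch of two ways to get the number itself (assuming the same stock frame and Close column as above):

lastprice = stock['Close'].iloc[-1]             # select the column first, then take the last element
lastprice = stock.at[stock.index[-1], 'Close']  # or look up the last index label directly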
12
votes

The quickest/easiest options I have found are the following, where 501 represents the row index:

df.at[501, 'column_name']
df.get_value(501, 'column_name')

(Note that get_value has since been deprecated and removed in newer pandas versions; at is the recommended replacement.)
6
votes

I am not sure whether this is good practice, but I noticed I can also get just the value by casting the Series to float.

e.g.

rate
3    0.042679
Name: Unemployment_rate, dtype: float64

float(rate)
0.0426789
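
A related option (a minimal sketch, assuming a reasonably recent pandas and the one-element Series rate from above) is Series.item(), which returns the scalar and fails loudly otherwise:

val = rate.item()   # returns the single float; raises ValueError if rate does not hold exactly one element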

5
votes

For pandas 0.10, where iloc is unavailable, filter a DataFrame and get the first row's data for the column VALUE:

df_filt = df[(df['C1'] == C1val) & (df['C2'] == C2val)]  # parentheses are required: & binds tighter than ==
result = df_filt.get_value(df_filt.index[0], 'VALUE')

If more than one row matches the filter, this obtains the first row's value. An exception is raised if the filter results in an empty data frame.
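
For comparison, a minimal sketch of the same lookup on a modern pandas version, where get_value is no longer available (same hypothetical C1, C2 and VALUE columns as above):

df_filt = df[(df['C1'] == C1val) & (df['C2'] == C2val)]
result = df_filt['VALUE'].iloc[0]   # first matching row's value; raises IndexError if nothing matched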

4
votes
df_gdp.columns

Index([u'Country', u'Country Code', u'Indicator Name', u'Indicator Code', u'1960', u'1961', u'1962', u'1963', u'1964', u'1965', u'1966', u'1967', u'1968', u'1969', u'1970', u'1971', u'1972', u'1973', u'1974', u'1975', u'1976', u'1977', u'1978', u'1979', u'1980', u'1981', u'1982', u'1983', u'1984', u'1985', u'1986', u'1987', u'1988', u'1989', u'1990', u'1991', u'1992', u'1993', u'1994', u'1995', u'1996', u'1997', u'1998', u'1999', u'2000', u'2001', u'2002', u'2003', u'2004', u'2005', u'2006', u'2007', u'2008', u'2009', u'2010', u'2011', u'2012', u'2013', u'2014', u'2015', u'2016'], dtype='object')

df_gdp[df_gdp["Country Code"] == "USA"]["1996"].values[0]

8100000000000.0

1
votes

Converting it to integer worked for me:

int(sub_df.iloc[0])
0
votes

To get the full row's values as JSON (instead of a Series):

row = df.iloc[0]

Use the to_json method like below:

row.to_json()
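
For illustration, a minimal sketch of what the call returns (using the sub_df frame from the first answer; exact float formatting may differ):

row = sub_df.iloc[0]
row.to_json()   # a string along the lines of '{"A":-0.133653,"B":-0.030854}'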