I have an array of floats (some normal numbers, some nans) that is coming out of an apply on a pandas dataframe.
For some reason, numpy.isnan is failing on this array.
A great substitute for np.isnan() and pd.isnull() is:
for i in range(a.shape[0]):
    if a[i] != a[i]:
        # do something here
        # a[i] is NaN
since NaN is the only value that is not equal to itself.
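As a minimal vectorized sketch of the same trick (assuming a is a NumPy float array):
import numpy as np

a = np.array([1.0, np.nan, 3.0])
mask = a != a    # True exactly where the element is NaN
print(mask)      # [False  True False]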
np.isnan can be applied to NumPy arrays of native dtype (such as np.float64):
In [99]: np.isnan(np.array([np.nan, 0], dtype=np.float64))
Out[99]: array([ True, False], dtype=bool)
but raises TypeError when applied to object arrays:
In [96]: np.isnan(np.array([np.nan, 0], dtype=object))
TypeError: ufunc 'isnan' not supported for the input types, and the inputs could not be safely coerced to any supported types according to the casting rule ''safe''
Since you have Pandas, you could use pd.isnull instead -- it can accept NumPy arrays of object or native dtypes:
In [97]: pd.isnull(np.array([np.nan, 0], dtype=float))
Out[97]: array([ True, False], dtype=bool)
In [98]: pd.isnull(np.array([np.nan, 0], dtype=object))
Out[98]: array([ True, False], dtype=bool)
Note that None is also considered a null value in object arrays.
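For example (an illustrative continuation of the session above):
In [100]: pd.isnull(np.array([None, np.nan, 0], dtype=object))
Out[100]: array([ True,  True, False], dtype=bool)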
On top of @unutbu's answer, you could coerce the pandas object array to a native (float64) dtype, along these lines:
import pandas as pd
pd.to_numeric(df['tester'], errors='coerce')
Specify errors='coerce' to force strings that can't be parsed to a numeric value to become NaN. The resulting column will have dtype: float64, and the np.isnan check should then work.
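As a minimal sketch of the full coerce-then-check flow (the DataFrame df and its 'tester' values are hypothetical):
import numpy as np
import pandas as pd

# 'tester' mixes strings and numbers, so the column dtype is object
df = pd.DataFrame({'tester': ['1.5', 'oops', None, 2.0]})

clean = pd.to_numeric(df['tester'], errors='coerce')  # dtype: float64
print(np.isnan(clean.values))                         # [False  True  True False]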
Make sure you read the CSV file in with pandas:
import pandas as pd
data = pd.read_csv('your_file.csv')  # hypothetical filename
condition = pd.isnull(data[i][j])    # i: column label, j: row label