Segmentation fault writing xarray dataset to netcdf or dataframe

走远了吗. Submitted on 2019-12-25 06:24:29

Question


I get a segmentation fault when working with an xarray dataset that was created from multiple grib2 files. The fault occurs when writing out to a netCDF file as well as when writing to a dataframe. Any suggestions on what is going wrong are appreciated.

import os
files = os.listdir(download_dir)

Example of files (from http://dd.weather.gc.ca/model_hrdps/west/grib2/00/000/) 'CMC_hrdps_west_RH_TGL_2_ps2.5km_2016072800_P015-00.grib2',... 'CMC_hrdps_west_TMP_TGL_2_ps2.5km_2016072800_P011-00.grib2'

import xarray as xr

# import and combine all grib2 files
ds = xr.open_mfdataset(files, concat_dim='time', engine='pynio')

<xarray.Dataset>
Dimensions:    (time: 48, xgrid_0: 685, ygrid_0: 485)
Coordinates:
    gridlat_0  (ygrid_0, xgrid_0) float32 44.6896 44.6956 44.7015 44.7075 ...
  * ygrid_0    (ygrid_0) int64 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 ...
  * xgrid_0    (xgrid_0) int64 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 ...
  * time       (time) datetime64[ns] 2016-07-28T01:00:00 2016-07-28T02:00:00 ...
    gridlon_0  (ygrid_0, xgrid_0) float32 -129.906 -129.879 -129.851 ...
Data variables:
    u          (time, ygrid_0, xgrid_0) float64 nan nan nan nan nan nan nan ...
    gridrot_0  (time, ygrid_0, xgrid_0) float32 nan nan nan nan nan nan nan ...
    Qli        (time, ygrid_0, xgrid_0) float64 nan nan nan nan nan nan nan ...
    Qsi        (time, ygrid_0, xgrid_0) float64 nan nan nan nan nan nan nan ...
    p          (time, ygrid_0, xgrid_0) float64 nan nan nan nan nan nan nan ...
    rh         (time, ygrid_0, xgrid_0) float64 nan nan nan nan nan nan nan ...
    press      (time, ygrid_0, xgrid_0) float64 nan nan nan nan nan nan nan ...
    t          (time, ygrid_0, xgrid_0) float64 nan nan nan nan nan nan nan ...
    vw_dir     (time, ygrid_0, xgrid_0) float64 nan nan nan nan nan nan nan ...

Writing out to netCDF

ds.to_netcdf('test.nc')

Segmentation fault (core dumped)


Answer 1:


PyNIO doesn't play well with multithreading. Try adding lock=True to open_mfdataset (we should probably set this by default).

Try adding preprocess=lambda x: x.load() to the open_mfdataset call. This ensures that each dataset is fully loaded into memory before the next one is processed.
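Putting the two suggestions together, the call from the question would look roughly like the sketch below. download_dir, the pynio engine, and the to_netcdf step are taken from the question; whether lock=True is accepted may depend on your xarray version, so treat this as a sketch rather than a verified fix.

import os
import xarray as xr

# os.listdir returns bare filenames, so join them with the download directory
files = [os.path.join(download_dir, f) for f in os.listdir(download_dir)]

# Serialize PyNIO access and eagerly load each file before opening the next
ds = xr.open_mfdataset(files,
                       concat_dim='time',
                       engine='pynio',
                       lock=True,
                       preprocess=lambda x: x.load())

ds.to_netcdf('test.nc')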



Source: https://stackoverflow.com/questions/38711915/segmentation-fault-writing-xarray-datset-to-netcdf-or-dataframe
