-
You can't use the
-
Hi, @erikvansebille. Thanks for your kind reply. Based on your suggestions, I reworked the code and ran it again, but the problem is still not resolved.

1. When I use print-statements in my kernels, it raises a RuntimeError, shown below:

RuntimeError Traceback (most recent call last)
File ~\anaconda3\envs\py3_parcels\lib\site-packages\parcels\particleset\baseparticleset.py:433, in BaseParticleSet.execute(self, pyfunc, pyfunc_inter, endtime, runtime, dt, moviedt, recovery, output_file, movie_background_field, verbose_progress, postIterationCallbacks, callbackdt)
File ~\anaconda3\envs\py3_parcels\lib\site-packages\parcels\particleset\particlesetsoa.py:715, in ParticleSetSOA.Kernel(self, pyfunc, c_include, delete_cfiles)
File ~\anaconda3\envs\py3_parcels\lib\site-packages\parcels\kernel\basekernel.py:323, in BaseKernel.from_list(cls, fieldset, ptype, pyfunc_list, *args, **kwargs)
File ~\anaconda3\envs\py3_parcels\lib\site-packages\parcels\kernel\kernelsoa.py:98, in KernelSOA.__init__(self, fieldset, ptype, pyfunc, funcname, funccode, py_ast, funcvars, c_include, delete_cfiles)
File ~\anaconda3\envs\py3_parcels\lib\site-packages\parcels\compilation\codegenerator.py:526, in AbstractKernelGenerator.generate(self, py_ast, funcvars)
File ~\anaconda3\envs\py3_parcels\lib\ast.py:418, in NodeVisitor.visit(self, node)
File ~\anaconda3\envs\py3_parcels\lib\site-packages\parcels\compilation\codegenerator.py:926, in ArrayKernelGenerator.visit_FunctionDef(self, node)
File ~\anaconda3\envs\py3_parcels\lib\ast.py:418, in NodeVisitor.visit(self, node)
File ~\anaconda3\envs\py3_parcels\lib\site-packages\parcels\compilation\codegenerator.py:644, in AbstractKernelGenerator.visit_Expr(self, node)
File ~\anaconda3\envs\py3_parcels\lib\ast.py:418, in NodeVisitor.visit(self, node)
File ~\anaconda3\envs\py3_parcels\lib\site-packages\parcels\compilation\codegenerator.py:599, in AbstractKernelGenerator.visit_Call(self, node)
RuntimeError: This print statement is not supported in Python3 version of Parcels

2. When I use "from parcels.tools.loggers import logger" and try to run in Scipy mode, it results in a KernelError, with the following details:

KernelError Traceback (most recent call last)
File ~\anaconda3\envs\py3_parcels\lib\site-packages\parcels\particleset\baseparticleset.py:548, in BaseParticleSet.execute(self, pyfunc, pyfunc_inter, endtime, runtime, dt, moviedt, recovery, output_file, movie_background_field, verbose_progress, postIterationCallbacks, callbackdt)
File ~\anaconda3\envs\py3_parcels\lib\site-packages\parcels\kernel\kernelsoa.py:213, in KernelSOA.execute(self, pset, endtime, dt, recovery, output_file, execute_once)
File ~\anaconda3\envs\py3_parcels\lib\site-packages\parcels\tools\statuscodes.py:121, in recovery_kernel_error(particle, fieldset, time)
KernelError: 0
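For comparison, here is a minimal Scipy-mode sketch where a bare Python print does run, since Scipy-mode kernels execute as ordinary Python. The zero-velocity toy fieldset and all names here are placeholders, not my actual setup:

import numpy as np
from parcels import FieldSet, ParticleSet, ScipyParticle

# Toy fieldset with zero velocities on a 2x2 grid (placeholder data)
fieldset = FieldSet.from_data(
    {'U': np.zeros((2, 2)), 'V': np.zeros((2, 2))},
    {'lon': np.array([0., 1.]), 'lat': np.array([0., 1.])})

def PrintKernel(particle, fieldset, time):
    # In Scipy mode the kernel runs as plain Python, so print executes as-is
    print("particle %d at lon=%g, lat=%g" % (particle.id, particle.lon, particle.lat))

pset = ParticleSet(fieldset, pclass=ScipyParticle, lon=[0.5], lat=[0.5])
pset.execute(PrintKernel, runtime=2, dt=1)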
-
Hi @Cyuan-hub, can you
-
Hi, @erikvansebille, thanks for the quick response.

1. The code of my kernel where I added the print-statements for JIT is shown below:

def DeleteParticle(particle, fieldset, time):
def Ageing(particle, fieldset, time):
def AdvectionRK4(particle, fieldset, time):
def BeachTesting_Adv(particle, fieldset, time):
def StokesDrag(particle, fieldset, time):
def Beaching_StkDr(particle, fieldset, time):

2. When I use "import logger" in Scipy mode, it results in a ModuleNotFoundError, with the following details:

ModuleNotFoundError Traceback (most recent call last)
ModuleNotFoundError: No module named 'logger'
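As a short sketch of why the bare import fails: there is no top-level module named logger, so only the package-level imports work, and only at script level (both import paths already appear elsewhere in this thread):

from parcels import logger                  # re-exported at the package level
# or, equivalently, from its actual location:
# from parcels.tools.loggers import logger

logger.info("FieldSet and ParticleSet built; starting execution")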
-
OK, the two answers below
-
Hi @erikvansebille, apologies for the delay in responding; I got held up with some things here.

1. I have attempted to use simple print-statements for JIT mode. Currently, I'm only employing a basic kernel to test the print-statements. Below are the modified sections of my kernel where I added the print-statements for JIT:

def DeleteParticle(particle, fieldset, time):
def AdvectionRK4(particle, fieldset, time):
Kernels = [AdvectionRK4]
pset.repeatdt = None
pset.execute(Kernels,

However, upon inspection, I noticed that the results did not include any printed output from the print-statements. The specific output information is as follows:

INFO: Compiled ArrayCustomParticleAdvectionRK4 ==> C:\Users\Lenovo\AppData\Local\Temp\parcels-tmp\lib02ad6336bab067f80c8fb409f77caa46_0.dll

2. When I use "import logging" in Scipy mode, it yields a "KernelError: 0" along with an "Error: name 'logging' is not defined". The detailed error information is as follows:

KernelError Traceback (most recent call last)
File ~\anaconda3\envs\py3_parcels\lib\site-packages\parcels\particleset\baseparticleset.py:548, in BaseParticleSet.execute(self, pyfunc, pyfunc_inter, endtime, runtime, dt, moviedt, recovery, output_file, movie_background_field, verbose_progress, postIterationCallbacks, callbackdt)
File ~\anaconda3\envs\py3_parcels\lib\site-packages\parcels\kernel\kernelsoa.py:213, in KernelSOA.execute(self, pset, endtime, dt, recovery, output_file, execute_once)
File ~\anaconda3\envs\py3_parcels\lib\site-packages\parcels\tools\statuscodes.py:121, in recovery_kernel_error(particle, fieldset, time)
KernelError: 0

The version of Python I'm using is Python 3.10.12. It's puzzling that the error "name 'logging' is not defined" still occurs when importing the built-in logging module.
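A kernel-safe alternative sketch (hypothetical names, not my production code): record diagnostics in particle Variables and inspect them after execution, which sidesteps both print and logger inside kernels and works in JIT as well as Scipy mode:

import numpy as np
from parcels import JITParticle, Variable

class DebugParticle(JITParticle):
    # store the last sampled velocity so it can be inspected after execution
    last_u = Variable('last_u', dtype=np.float32, initial=0., to_write=True)
    last_v = Variable('last_v', dtype=np.float32, initial=0., to_write=True)

def SampleUV(particle, fieldset, time):
    (u, v) = fieldset.UV[time, particle.depth, particle.lat, particle.lon]
    particle.last_u = u
    particle.last_v = v

# after pset.execute(..., output_file=...), the values appear in the written
# output, or can be read directly from the ParticleSet, e.g. pset.last_u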
-
Hello everyone!
Sorry in advance if this is already documented; I went through the documentation but did not find any clear guidance.
I am using Python version 3.10.12 and OceanParcels version 2.4.2. I want to add logging to Parcels' kernel functions for debugging purposes. However, when I use the "logger" imported from parcels, I run into errors. Here are the details:
The code that I use is as follows:
from parcels import FieldSet, Field, NestedField, VectorField, ParticleFile, ParticleSet, JITParticle, Variable
from parcels import ErrorCode, logger
import glob
import xarray as xr
import numpy as np
from operator import attrgetter
from parcels import ParcelsRandom as random
import math
import zarr
from datetime import datetime, timedelta
import calendar
startyear = 2020
endyear = 2020
day_month = [31,28,31,30,31,30,31,31,30,31,30,31]
time_stamps = []
for year in range(startyear, endyear + 1):
    if calendar.isleap(year):
        day_month[1] = 29
    else:
        day_month[1] = 28
    for month in range(1, 13):
        if year == 2020 and month == 6:
            for day in range(1, day_month[month - 1] + 1):
                dateone = np.datetime64('%d-%.2d-%.2d' % (year, month, day)) + np.timedelta64(12, 'h')
                time_stamps.append(dateone)
timestamps = np.expand_dims(np.array(time_stamps), axis=1)
vel_path = 'G:/JupyterNotebook/SCS_case2/input/Datasets/SurfaceCurrents/test_for_beach/'
vel_files = vel_path + '*_TotalDailyMean.nc'
filenames = {'U': vel_files, 'V': vel_files}
variables = {'U': 'uo', 'V': 'vo'}  # 'uo' = eastward velocity, 'vo' = northward velocity
dimensions = {'U': {'lat': 'latitude', 'lon': 'longitude'},
              'V': {'lat': 'latitude', 'lon': 'longitude'}}
fieldset = FieldSet.from_netcdf(filenames, variables, dimensions, mesh="spherical",
                                timestamps=timestamps, allow_time_extrapolation=True)
from glob import glob
data_dir = 'G:/JupyterNotebook/Data-Download-from-Web/1-5degree/test_for_beach/'
fnames = []
basepath = data_dir + '*_Wave_Stokes_drift.nc'
fnames += sorted(glob(str(basepath)))
dimensionsU = {'lon': 'longitude', 'lat': 'latitude'} #, 'time': 'time'
dimensionsV = {'lon': 'longitude', 'lat': 'latitude'}
Uuss = Field.from_netcdf(fnames, ('Uuss', 'VSdx'), dimensionsU, fieldtype='U',
                         timestamps=timestamps, allow_time_extrapolation=True)
Vuss = Field.from_netcdf(fnames, ('Vuss', 'VSdy'), dimensionsV, fieldtype='V',
                         timestamps=timestamps, allow_time_extrapolation=True,
                         grid=Uuss.grid, dataFiles=Uuss.dataFiles)
fieldset.add_field(Uuss)
fieldset.add_field(Vuss)
uv_uss = VectorField('UVuss', fieldset.Uuss, fieldset.Vuss)
fieldset.add_vector_field(uv_uss)
coast_dist_file = 'G:/JupyterNotebook/SCS_case2/input/distances_to_nearest_coastline_matlab_Cy.nc'
data_dist = xr.open_dataset(coast_dist_file)
coast_distance = data_dist['distance_to_coast']
lon_dist = coast_distance.longitude.values
lat_dist = coast_distance.latitude.values
depth_dist = None
coast_dist_field = Field('dist', coast_distance.values, lon=lon_dist, lat=lat_dist, depth=depth_dist)
fieldset.add_field(coast_dist_field)
landmaskFile = 'G:/JupyterNotebook/SCS_case2/input/landmask_0ocean.nc'
data_mask = xr.open_dataset(landmaskFile)
LandMask = data_mask['mask']
lon_mask = LandMask.longitude.values
lat_mask = LandMask.latitude.values
depth_mask = None
LandMask = Field('LandMask', LandMask.values, lon=lon_mask, lat=lat_mask, depth=depth_mask)
fieldset.add_field(LandMask)
class CustomParticle(JITParticle):
    age = Variable('age', dtype=np.float32, initial=0.)
    particle_state = Variable('particle_state', dtype=np.int32, initial=0)
    unbeachCount = Variable('unbeachCount', dtype=np.int32, initial=0)
    beachCount = Variable('beachCount', dtype=np.int32, initial=0)
    beached_lon = Variable('beached_lon', dtype=np.float64, to_write=False)
    beached_lat = Variable('beached_lat', dtype=np.float64, to_write=False)
    beached_time = Variable('beached_time', dtype=np.float64, to_write=True, initial=0.)
    time_on_land = Variable('time_on_land', dtype=np.float64, to_write=False, initial=0.)
    prev_lon = Variable('prev_lon', dtype=np.float32, to_write=True, initial=attrgetter('lon'))  # the previous longitude in water
    prev_lat = Variable('prev_lat', dtype=np.float32, to_write=True, initial=attrgetter('lat'))  # the previous latitude in water
    # prev_time = Variable('prev_time', initial=attrgetter('time'), to_write=False)
position_file = 'G:/JupyterNotebook/SCS_case2/input/Particle_InitialPosi/'
lon_file = 'Lons_for_releasePoints.txt'
lat_file = 'Lats_for_releasePoints.txt'
lons = []
lats = []
with open(position_file + lon_file, 'r', encoding='utf-8') as flon:
    londata = flon.readlines()
for line in londata:
    lons.append(float(line.strip()))  # convert from string to float
with open(position_file + lat_file, 'r', encoding='utf-8') as flat:
    latdata = flat.readlines()
for line in latdata:
    lats.append(float(line.strip()))
particles_lon = np.array(lons)
particles_lat = np.array(lats)
release_depth = np.zeros(particles_lon.shape)
release_interval = timedelta(hours=24)  # release from the same set of locations every 24 hours
pset = ParticleSet.from_list(fieldset,
                             pclass=CustomParticle,
                             lon=particles_lon,
                             lat=particles_lat,
                             depth=release_depth,
                             repeatdt=release_interval)
def DeleteParticle(particle, fieldset, time):
    particle.delete()
    logger.info_once(f"Particle {particle.id} is deleted at ({particle.lon}, {particle.lat}) at time {time}.")

def Ageing(particle, fieldset, time):
    particle.age += particle.dt
    logger.info_once(f"Particle {particle.id} is aging. New age: {particle.age}")

def AdvectionRK4(particle, fieldset, time):
    logger.info_once(f"Particle {particle.id} is advecting at time {time}.")
    if particle.particle_state == 0:
        (u1, v1) = fieldset.UV[time, particle.depth, particle.lat, particle.lon]
        lon1_adv, lat1_adv = (particle.lon + u1*.5*particle.dt, particle.lat + v1*.5*particle.dt)
        (u2, v2) = fieldset.UV[time + .5 * particle.dt, particle.depth, lat1_adv, lon1_adv]
        lon2_adv, lat2_adv = (particle.lon + u2*.5*particle.dt, particle.lat + v2*.5*particle.dt)
        (u3, v3) = fieldset.UV[time + .5 * particle.dt, particle.depth, lat2_adv, lon2_adv]
        lon3_adv, lat3_adv = (particle.lon + u3*particle.dt, particle.lat + v3*particle.dt)
        (u4, v4) = fieldset.UV[time + particle.dt, particle.depth, lat3_adv, lon3_adv]
        particle.lon += (u1 + 2*u2 + 2*u3 + u4) / 6. * particle.dt
        particle.lat += (v1 + 2*v2 + 2*v3 + v4) / 6. * particle.dt
        particle.particle_state = 2
        logger.warning_once("This is a warning message that will be displayed only once.")
def BeachTesting_Adv(particle, fieldset, time):
    if particle.particle_state == 2:
        part_dis_to_coast1 = fieldset.dist[time, 0, particle.lat, particle.lon]
        beaching_dist1 = 0.1
        if part_dis_to_coast1 <= beaching_dist1:
            particle.lon = particle.prev_lon
            particle.lat = particle.prev_lat
            particle.particle_state = 0
            particle.unbeachCount += 1
            logger.info_once(f"Particle {particle.id} is returning to water after beach testing.")
        else:
            particle.particle_state = 0
            particle.prev_lon = particle.lon
            particle.prev_lat = particle.lat
            logger.info_once(f"Particle {particle.id} is continuing advection after beach testing.")
def StokesDrag(particle, fieldset, time):
    if particle.particle_state == 0:
        (u_uss, v_uss) = fieldset.UVuss[time, particle.depth, particle.lat, particle.lon]
        particle.lon += u_uss * particle.dt
        particle.lat += v_uss * particle.dt
        particle.particle_state = 3
        logger.info_once(f"Particle {particle.id} is experiencing Stokes drag at time {time}.")
def Beaching_StkDr(particle, fieldset, time):
    if particle.particle_state == 3:
        part_dis_to_coast2 = fieldset.dist[time, 0, particle.lat, particle.lon]
        beaching_dist2 = 0.1
        if part_dis_to_coast2 <= beaching_dist2:
            particle.particle_state = 1
            particle.delete()
            logger.info_once(f"Particle {particle.id} is beaching due to Stokes drag at time {time}.")
        else:
            particle.particle_state = 0
            particle.prev_lon = particle.lon
            particle.prev_lat = particle.lat
            logger.info_once(f"Particle {particle.id} is not beaching after experiencing Stokes drag at time {time}.")
Kernels = [AdvectionRK4, BeachTesting_Adv, StokesDrag, Beaching_StkDr, Ageing]
out_interval = timedelta(days=1)
new_output_directory = "G:/JupyterNotebook/SCS_case2/ParcelsOutput/ZarrData/new/"
output_dirstore_name = new_output_directory + "Adv-loggerTesting.zarr/"
output_dirstore = zarr.storage.DirectoryStore(output_dirstore_name)
outputfile = pset.ParticleFile(name=output_dirstore, outputdt=out_interval)
dt = timedelta(minutes=30)
totaltime_for_running = timedelta(days=30)
running_time_first = timedelta(days=10)
pset.execute(Kernels,
             runtime=running_time_first,
             dt=dt,
             output_file=outputfile,
             recovery={ErrorCode.ErrorOutOfBounds: DeleteParticle,
                       ErrorCode.ErrorTimeExtrapolation: DeleteParticle})
pset.repeatdt = None
pset.execute(Kernels,
             runtime=timedelta(days=20),
             dt=dt,
             output_file=outputfile,
             recovery={ErrorCode.ErrorOutOfBounds: DeleteParticle,
                       ErrorCode.ErrorTimeExtrapolation: DeleteParticle})
I am getting the following error when I run the above code:
NotImplementedError Traceback (most recent call last)
Cell In[4], line 17
14 totaltime_for_running = timedelta(days = 30)
15 running_time_first = timedelta(days = 10)
---> 17 pset.execute(Kernels,
18 runtime = running_time_first,
19 dt = dt,
20 output_file = outputfile,
21 recovery = {ErrorCode.ErrorOutOfBounds: DeleteParticle,
22 ErrorCode.ErrorTimeExtrapolation: DeleteParticle})
24 pset.repeatdt = None
26 pset.execute(Kernels,
27 runtime = timedelta(days = 20),
28 dt = dt,
29 output_file = outputfile,
30 recovery = {ErrorCode.ErrorOutOfBounds: DeleteParticle,
31 ErrorCode.ErrorTimeExtrapolation: DeleteParticle})
File ~\anaconda3\envs\py3_parcels\lib\site-packages\parcels\particleset\baseparticleset.py:433, in BaseParticleSet.execute(self, pyfunc, pyfunc_inter, endtime, runtime, dt, moviedt, recovery, output_file, movie_background_field, verbose_progress, postIterationCallbacks, callbackdt)
431 self.kernel = pyfunc
432 else:
--> 433 self.kernel = self.Kernel(pyfunc)
434 # Prepare JIT kernel execution
435 if self.collection.ptype.uses_jit:
File ~\anaconda3\envs\py3_parcels\lib\site-packages\parcels\particleset\particlesetsoa.py:715, in ParticleSetSOA.Kernel(self, pyfunc, c_include, delete_cfiles)
698 """Wrapper method to convert a
pyfunc
into a :class:parcels.kernel.Kernel
object.699
700 Conversion is based on
fieldset
andptype
of the ParticleSet.(...)
712 (Default value = "")
713 """
714 if isinstance(pyfunc, list):
--> 715 return Kernel.from_list(
716 self.fieldset,
717 self.collection.ptype,
718 pyfunc,
719 c_include=c_include,
720 delete_cfiles=delete_cfiles,
721 )
722 return Kernel(
723 self.fieldset,
724 self.collection.ptype,
(...)
727 delete_cfiles=delete_cfiles,
728 )
File ~\anaconda3\envs\py3_parcels\lib\site-packages\parcels\kernel\basekernel.py:323, in BaseKernel.from_list(cls, fieldset, ptype, pyfunc_list, *args, **kwargs)
320 raise ValueError("Argument function_lst should be a list of functions.")
322 pyfunc_list = pyfunc_list.copy()
--> 323 pyfunc_list[0] = cls(fieldset, ptype, pyfunc_list[0], *args, **kwargs)
324 return functools.reduce(lambda x, y: x + y, pyfunc_list)
File ~\anaconda3\envs\py3_parcels\lib\site-packages\parcels\kernel\kernelsoa.py:98, in KernelSOA.__init__(self, fieldset, ptype, pyfunc, funcname, funccode, py_ast, funcvars, c_include, delete_cfiles)
96 if self.ptype.uses_jit:
97 kernelgen = KernelGenerator(fieldset, ptype)
---> 98 kernel_ccode = kernelgen.generate(deepcopy(self.py_ast),
99 self.funcvars)
100 self.field_args = kernelgen.field_args
101 self.vector_field_args = kernelgen.vector_field_args
File ~\anaconda3\envs\py3_parcels\lib\site-packages\parcels\compilation\codegenerator.py:520, in AbstractKernelGenerator.generate(self, py_ast, funcvars)
517 def generate(self, py_ast, funcvars):
518 # Replace occurences of intrinsic objects in Python AST
519 transformer = IntrinsicTransformer(self.fieldset, self.ptype)
--> 520 py_ast = transformer.visit(py_ast)
522 # Untangle Pythonic tuple-assignment statements
523 py_ast = TupleSplitter().visit(py_ast)
File ~\anaconda3\envs\py3_parcels\lib\ast.py:418, in NodeVisitor.visit(self, node)
416 method = 'visit_' + node.__class__.__name__
417 visitor = getattr(self, method, self.generic_visit)
--> 418 return visitor(node)
File ~\anaconda3\envs\py3_parcels\lib\ast.py:494, in NodeTransformer.generic_visit(self, node)
492 for value in old_value:
493 if isinstance(value, AST):
--> 494 value = self.visit(value)
495 if value is None:
496 continue
File ~\anaconda3\envs\py3_parcels\lib\ast.py:418, in NodeVisitor.visit(self, node)
416 method = 'visit_' + node.__class__.__name__
417 visitor = getattr(self, method, self.generic_visit)
--> 418 return visitor(node)
File ~\anaconda3\envs\py3_parcels\lib\ast.py:503, in NodeTransformer.generic_visit(self, node)
501 old_value[:] = new_values
502 elif isinstance(old_value, AST):
--> 503 new_node = self.visit(old_value)
504 if new_node is None:
505 delattr(node, field)
File ~\anaconda3\envs\py3_parcels\lib\ast.py:418, in NodeVisitor.visit(self, node)
416 method = 'visit_' + node.__class__.__name__
417 visitor = getattr(self, method, self.generic_visit)
--> 418 return visitor(node)
File ~\anaconda3\envs\py3_parcels\lib\site-packages\parcels\compilation\codegenerator.py:414, in IntrinsicTransformer.visit_Call(self, node)
413 def visit_Call(self, node):
--> 414 node.func = self.visit(node.func)
415 node.args = [self.visit(a) for a in node.args]
416 node.keywords = {kw.arg: self.visit(kw.value) for kw in node.keywords}
File ~\anaconda3\envs\py3_parcels\lib\ast.py:418, in NodeVisitor.visit(self, node)
416 method = 'visit_' + node.__class__.__name__
417 visitor = getattr(self, method, self.generic_visit)
--> 418 return visitor(node)
File ~\anaconda3\envs\py3_parcels\lib\site-packages\parcels\compilation\codegenerator.py:332, in IntrinsicTransformer.visit_Attribute(self, node)
328 raise NotImplementedError("Cannot convert random functions in kernels to C-code.\n"
329 "Use
import parcels.rng as ParcelsRandom
and then ParcelsRandom.random(), ParcelsRandom.uniform() etc.\n"330 "For more information, see http://oceanparcels.org/faq.html#kernelwriting")
331 else:
--> 332 raise NotImplementedError(f"Cannot convert '{node.value.id}' used in kernel to C-code")
NotImplementedError: Cannot convert 'logger' used in kernel to C-code
Any assistance or guidance on how to properly incorporate logging into Parcels' kernel functions would be greatly appreciated. Thank you in advance for your help!
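For completeness, a sketch of the one pattern that seems consistent with this error: keeping all logger calls at script level, outside the kernels, and logging around pset.execute instead (names reused from the script above; this is a workaround, not an officially documented recipe):

from parcels import logger

logger.info(f"Starting first leg with {len(pset)} particles")
pset.execute(Kernels,
             runtime=running_time_first,
             dt=dt,
             output_file=outputfile,
             recovery={ErrorCode.ErrorOutOfBounds: DeleteParticle,
                       ErrorCode.ErrorTimeExtrapolation: DeleteParticle})
logger.info(f"First leg finished; {len(pset)} particles remain")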