Derived Fields and Profiles
One of the most powerful features in yt is the ability to create derived fields that act and look exactly like fields that exist on disk. This means that they will be generated on demand and can be used anywhere a field that exists on disk would be used. Additionally, you can create them simply by writing Python functions.
from yt.imods import *
Derived Fields
This is an example of the simplest possible way to create a derived field. All derived fields are defined by a function and some metadata; that metadata can include units, LaTeX-friendly names, conversion factors, and so on. Fields can be defined in the way shown in the next cell. This creates a function that accepts two arguments and then provides the units for that field. In this case, our field is Dinosaurs and our units are Trex/s. The function itself can access any field in the simulation, and it does so by requesting data from the object called data.
@derived_field(units = "Trex/s")
def Dinosaurs(field, data):
return data["Density"]**(2.0/3.0) * data["VelocityMagnitude"]
One important thing to note is that derived fields must be defined before any datasets are loaded. Let's load up our data and take a look at some quantities.
pf = load("IsolatedGalaxy/galaxy0030/galaxy0030")
dd = pf.h.all_data()
print dd.quantities.keys()
One interesting question is, what are the minimum and maximum values of dinosaur production rates in our isolated galaxy? We can answer that by examining the Extrema quantity -- the exact same way that we would for Density, Temperature, and so on.
print dd.quantities["Extrema"]("Dinosaurs")
We can do the same for the average quantities as well.
print dd.quantities["WeightedAverageQuantity"]("Dinosaurs", weight="Temperature")
A Few Other Quantities
We can ask other quantities of our data, as well. For instance, this sequence of operations will find the most dense point, center a sphere on it, calculate the bulk velocity of that sphere, calculate the baryonic angular momentum vector, and then the density extrema. All of this is done in a memory-conservative way: if you have an absolutely enormous dataset, yt will split that dataset into pieces, apply intermediate reductions, and then apply a final reduction to calculate your quantity.
sp = pf.h.sphere("max", (10.0, 'kpc'))
bv = sp.quantities["BulkVelocity"]()
L = sp.quantities["AngularMomentumVector"]()
(rho_min, rho_max), = sp.quantities["Extrema"]("Density")
print bv, L, rho_min, rho_max
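The chunked, memory-conservative reduction described above can be sketched in plain NumPy (this is an illustrative sketch, not yt's internal implementation): only running intermediates are kept, so the full dataset never needs to fit in memory at once.

```python
import numpy as np

def chunked_extrema(chunks):
    """Compute (min, max) over an iterable of array chunks,
    keeping only two scalar intermediates in memory."""
    vmin, vmax = np.inf, -np.inf
    for chunk in chunks:
        # Intermediate reduction on each piece ...
        vmin = min(vmin, chunk.min())
        vmax = max(vmax, chunk.max())
    # ... then a final reduction across pieces.
    return vmin, vmax

chunks = [np.array([3.0, 1.0, 4.0]), np.array([1.0, 5.0, 9.0])]
print(chunked_extrema(chunks))  # (1.0, 9.0)
```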
Profiles
yt provides the ability to bin in 1, 2 and 3 dimensions. This means discretizing in one or more dimensions of phase space (density, temperature, etc) and then calculating either the total value of a field in each bin or the average value of a field in each bin.
We do this using the objects BinnedProfile1D, BinnedProfile2D, and BinnedProfile3D. The first two are the most common, since they are the easiest to visualize.
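To make concrete what a weighted 1D profile computes, here is a plain-NumPy sketch (not the yt API; the function name and signature are illustrative): values of one field are binned by another, and each bin holds the weighted average.

```python
import numpy as np

def weighted_profile_1d(bin_field, value_field, weight, bin_edges):
    """Weighted average of value_field in bins of bin_field."""
    # Map each element to a bin index (0-based).
    idx = np.digitize(bin_field, bin_edges) - 1
    nbins = len(bin_edges) - 1
    totals = np.zeros(nbins)
    weights = np.zeros(nbins)
    for i in range(nbins):
        sel = idx == i
        totals[i] = (value_field[sel] * weight[sel]).sum()
        weights[i] = weight[sel].sum()
    # Avoid division by zero in empty bins.
    return totals / np.where(weights > 0, weights, 1.0)

density = np.array([1.0, 1.5, 3.0])
temperature = np.array([10.0, 20.0, 30.0])
mass = np.array([1.0, 1.0, 2.0])
edges = np.array([1.0, 2.0, 4.0])
print(weighted_profile_1d(density, temperature, mass, edges))  # [15. 30.]
```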
This first set of commands manually creates a BinnedProfile1D from the sphere we created earlier, binned into 32 bins in density between rho_min and rho_max, and then takes the Density-weighted average of the fields Temperature and the previously defined Dinosaurs. We then make a log-log plot.
prof = BinnedProfile1D(sp, 32, "Density", rho_min, rho_max)
prof.add_fields(["Temperature", "Dinosaurs"], weight="Density")
pylab.loglog(prof["Density"], prof["Temperature"], "-x")
Now we plot the Dinosaurs field.
pylab.loglog(prof["Density"], prof["Dinosaurs"], '-x')
If we want to see the total mass in each bin, we add the CellMassMsun field with no weight. Specifying weight=None simply sums the values in each bin.
prof.add_fields(["CellMassMsun"], weight=None)
pylab.loglog(prof["Density"], prof["CellMassMsun"])
We can also specify accumulation, which cumulatively sums the bins from left to right. Note that for 2D and 3D profiles, this needs to be a tuple of length 2 or 3.
prof.add_fields(["CellMassMsun"], weight=None, accumulation=True)
pylab.loglog(prof["Density"], prof["CellMassMsun"])
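Left-to-right accumulation over bins corresponds to a cumulative sum; a minimal NumPy sketch (illustrative values, not from the dataset above):

```python
import numpy as np

# Total mass in each of four bins (hypothetical values).
mass_per_bin = np.array([1.0, 2.0, 3.0, 4.0])

# accumulation=True behaves like a running (cumulative) sum over the bins.
enclosed_mass = np.cumsum(mass_per_bin)
print(enclosed_mass)  # [ 1.  3.  6. 10.]
```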
Advanced Derived Fields
This section can be skipped!
You can also define fields that require extra ("ghost") zones. This is useful, for instance, if you want to take an average or apply a stencil; yt provides fields like DivV that do this internally. The next cell is a rather busy example of how to do it. You need to specify the validator ValidateSpatial with the number of extra zones you need on each side of the grid, and then inside your function you need to return a field with those zones stripped off. By necessity, the arrays returned by data[something] will have a larger spatial extent than what the function itself should return. If you specify that you need 0 extra zones, this will also work and will simply supply a grid object for the field.
@derived_field(name = "AveragedTemperature",
               validators = [ValidateSpatial(1)],
               units = r"K")
def _AveragedTemperature(field, data):
    nx, ny, nz = data["Temperature"].shape
    new_field = na.zeros((nx-2, ny-2, nz-2), dtype='float64')
    weight_field = na.zeros((nx-2, ny-2, nz-2), dtype='float64')
    i_i, j_i, k_i = na.mgrid[0:3, 0:3, 0:3]
    for i, j, k in zip(i_i.ravel(), j_i.ravel(), k_i.ravel()):
        sl = [slice(i, nx-(2-i)), slice(j, ny-(2-j)), slice(k, nz-(2-k))]
        new_field += data["Temperature"][sl] * data["CellMass"][sl]
        weight_field += data["CellMass"][sl]
    # Pad the stripped interior back out to the full grid shape.
    new_field2 = na.zeros((nx, ny, nz))
    new_field2[1:-1, 1:-1, 1:-1] = new_field / weight_field
    return new_field2
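Because that example is busy, here is a minimal one-dimensional analogue in plain NumPy (illustrative only, outside of yt): the input arrays carry one ghost zone on each side, a mass-weighted three-point average is computed on the interior, and the result is padded back to the input shape.

```python
import numpy as np

def averaged_field_1d(temperature, mass):
    """Mass-weighted 3-point average; inputs include 1 ghost zone per side."""
    n = temperature.shape[0]
    new_field = np.zeros(n - 2)
    weight_field = np.zeros(n - 2)
    for i in range(3):
        # Shifted interior slices: left neighbor, center, right neighbor.
        sl = slice(i, n - (2 - i))
        new_field += temperature[sl] * mass[sl]
        weight_field += mass[sl]
    # Pad the interior result back out to the full array shape.
    out = np.zeros(n)
    out[1:-1] = new_field / weight_field
    return out

# A uniform temperature field averages to itself on the interior.
print(averaged_field_1d(np.full(5, 10.0), np.ones(5)))
# [ 0. 10. 10. 10.  0.]
```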
Now, once again, we can access AveragedTemperature just like any other field. Note that because it requires ghost zones, this will be a much slower process!
pf = load("IsolatedGalaxy/galaxy0030/galaxy0030")
dd = pf.h.all_data()
(tmin, tmax), (atmin, atmax) = dd.quantities["Extrema"](["Temperature", "AveragedTemperature"])
print tmin, tmax, atmin, atmax
print tmin / atmin, tmax / atmax
Field Parameters
Field parameters are a method of passing information to derived fields. For instance, you might pass in information about a vector you want to use as a basis for a coordinate transformation. yt often uses things like bulk_velocity
to identify velocities that should be subtracted off. Here we show how that works:
sp_small = pf.h.sphere("max", (1.0, 'kpc'))
bv = sp_small.quantities["BulkVelocity"]()
sp = pf.h.sphere("max", (0.1, 'mpc'))
rv1 = sp.quantities["Extrema"]("RadialVelocity")
sp.clear_data()
sp.set_field_parameter("bulk_velocity", bv)
rv2 = sp.quantities["Extrema"]("RadialVelocity")
print bv
print rv1
print rv2
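The effect of the bulk_velocity field parameter can be seen in a plain-NumPy sketch (the function below is illustrative, not yt's implementation): the radial velocity is the dot product of the (optionally bulk-subtracted) velocity with the unit vector pointing away from the center.

```python
import numpy as np

def radial_velocity(pos, vel, center, bulk_velocity=None):
    """Radial velocity of points, optionally subtracting a bulk velocity."""
    if bulk_velocity is not None:
        vel = vel - bulk_velocity
    # Unit vectors pointing from the center to each point.
    r = pos - center
    r_hat = r / np.linalg.norm(r, axis=-1, keepdims=True)
    # Project the velocity onto the radial direction.
    return (vel * r_hat).sum(axis=-1)

pos = np.array([[1.0, 0.0, 0.0]])
vel = np.array([[2.0, 0.0, 0.0]])
center = np.zeros(3)
print(radial_velocity(pos, vel, center))                     # [2.]
print(radial_velocity(pos, vel, center,
                      bulk_velocity=np.array([2.0, 0.0, 0.0])))  # [0.]
```

Subtracting the bulk velocity shifts every radial velocity, which is why rv1 and rv2 above differ.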