yt.frontends.fits.data_structures module¶
- class yt.frontends.fits.data_structures.EventsFITSDataset(filename=None, *args, **kwargs)[source]¶
Bases:
SkyDataFITSDataset
- add_deposited_particle_field(deposit_field, method, kernel_name='cubic', weight_field=None)¶
Add a new deposited particle field
Creates a new deposited field based on the particle deposit_field.
- Parameters:
deposit_field (tuple) – The field name tuple of the particle field the deposited field will be created from. This must be a field name tuple so yt can appropriately infer the correct particle type.
method (string) – This is the “method name” which will be looked up in the particle_deposit namespace as methodname_deposit. Current methods include simple_smooth, sum, std, cic, weighted_mean, nearest and count.
kernel_name (string, default 'cubic') – This is the name of the smoothing kernel to use. It is only used for the simple_smooth method and is otherwise ignored. Current supported kernel names include cubic, quartic, quintic, wendland2, wendland4, and wendland6.
weight_field ((field_type, field_name) or None) – Weighting field name for deposition method weighted_mean. If None, use the particle mass.
- Return type:
The field name tuple for the newly created field.
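A minimal usage sketch (not part of the original docstring; the dataset path and the ("io", "particle_mass") field are illustrative):
>>> import yt
>>> ds = yt.load("my_dataset")  # hypothetical path
>>> # deposit particle mass onto the mesh with cloud-in-cell interpolation
>>> fname = ds.add_deposited_particle_field(("io", "particle_mass"), "cic")
>>> print(fname)  # the field name tuple of the new deposited field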
- add_field(name, function, sampling_type, *, force_override=False, **kwargs)¶
Dataset-specific call to add_field
Add a new field, along with supplemental metadata, to the list of available fields. This respects a number of arguments, all of which are passed on to the constructor for DerivedField.
- Parameters:
name (str) – is the name of the field.
function (callable) – A function handle that defines the field. Should accept arguments (field, data)
sampling_type (str) – “cell” or “particle” or “local”
force_override (bool) – If False (default), an error will be raised if a field of the same name already exists.
units (str) – A plain text string encoding the unit. Powers must be in python syntax (** instead of ^).
take_log (bool) – Describes whether the field should be logged
validators (list) – A list of FieldValidator objects
vector_field (bool) – Describes the dimensionality of the field. Currently unused.
display_name (str) – A name used in the plots
force_override – Whether to override an existing derived field. Does not work with on-disk fields.
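A minimal sketch of defining a derived field with this method (assuming ds has already been loaded; the field name, units, and function body are illustrative):
>>> def _double_density(field, data):
...     # field functions take the (field, data) signature described above
...     return 2 * data["gas", "density"]
>>> ds.add_field(
...     ("gas", "double_density"),
...     function=_double_density,
...     sampling_type="cell",
...     units="g/cm**3",
... )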
- add_gradient_fields(fields=None)¶
Add gradient fields.
Creates four new grid-based fields that represent the components of the gradient of an existing field, plus an extra field for the magnitude of the gradient. The gradient is computed using second-order centered differences.
- Parameters:
fields (str or tuple(str, str), or a list of the previous) – Label(s) for at least one field. Can either represent a tuple (<field type>, <field fname>) or simply the field name. Warning: several field types may match the provided field name, in which case the first one discovered internally is used.
- Return type:
A list of field name tuples for the newly created fields.
- Raises:
YTFieldNotParsable – If fields are not parsable to yt field keys.
YTFieldNotFound : – If at least one field can not be identified.
Examples
>>> grad_fields = ds.add_gradient_fields(("gas", "density"))
>>> print(grad_fields)
[
    ("gas", "density_gradient_x"),
    ("gas", "density_gradient_y"),
    ("gas", "density_gradient_z"),
    ("gas", "density_gradient_magnitude"),
]
Note that the above example assumes ds.geometry == ‘cartesian’. In general, the function will create gradient components along the axes of the dataset coordinate system. For instance, with cylindrical data, one gets ‘density_gradient_<r,theta,z>’
- add_mesh_sampling_particle_field(sample_field, ptype='all')¶
Add a new mesh sampling particle field
Creates a new particle field which has the value of the deposit_field at the location of each particle of type ptype.
- Parameters:
sample_field (tuple) – The field name tuple of the mesh field to be deposited onto the particles. This must be a field name tuple so yt can appropriately infer the correct particle type.
ptype (string, default 'all') – The particle type onto which the deposition will occur.
- Return type:
The field name tuple for the newly created field.
Examples
>>> ds = yt.load("output_00080/info_00080.txt") ... ds.add_mesh_sampling_particle_field(("gas", "density"), ptype="all")
>>> print("The density at the location of the particle is:") ... print(ds.r["all", "cell_gas_density"]) The density at the location of the particle is: [9.33886124e-30 1.22174333e-28 1.20402333e-28 ... 2.77410331e-30 8.79467609e-31 3.50665136e-30] g/cm**3
>>> len(ds.r["all", "cell_gas_density"]) == len(ds.r["all", "particle_ones"]) True
- add_particle_filter(filter)¶
Add particle filter to the dataset.
Add filter to the dataset and set up the relevant derived fields. It will also add any filtered_type that the filter depends on.
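A minimal sketch of registering and applying a particle filter (the filter name, particle type, and mass cut below are illustrative):
>>> import yt
>>> @yt.particle_filter(requires=["particle_mass"], filtered_type="io")
... def heavy(pfilter, data):
...     # keep only particles above an arbitrary mass threshold
...     return data[pfilter.filtered_type, "particle_mass"] > data.ds.quan(1e40, "g")
>>> ds.add_particle_filter("heavy")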
- add_particle_union(union)¶
- all_data(find_max=False, **kwargs)¶
all_data is a wrapper to the Region object for creating a region which covers the entire simulation domain.
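For instance, a brief sketch, assuming ds has already been loaded and carries a ("gas", "density") field:
>>> ad = ds.all_data()  # a region spanning the whole domain
>>> print(ad["gas", "density"].shape)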
- property arr¶
Converts an array into a yt.units.yt_array.YTArray.
The returned YTArray will be dimensionless by default, but can be cast to arbitrary units using the units keyword argument.
- Parameters:
input_array (Iterable) – A tuple, list, or array to attach units to
units (String unit specification, unit symbol or astropy object) – The units of the array. Powers must be specified using python syntax (cm**3, not cm^3).
input_units (Deprecated in favor of 'units')
dtype (string or NumPy dtype object) – The dtype of the returned array data
Examples
>>> import yt
>>> import numpy as np
>>> ds = yt.load("IsolatedGalaxy/galaxy0030/galaxy0030")
>>> a = ds.arr([1, 2, 3], "cm")
>>> b = ds.arr([4, 5, 6], "m")
>>> a + b
YTArray([ 401., 502., 603.]) cm
>>> b + a
YTArray([ 4.01, 5.02, 6.03]) m
Arrays returned by this function know about the dataset’s unit system
>>> a = ds.arr(np.ones(5), "code_length")
>>> a.in_units("Mpccm/h")
YTArray([ 1.00010449, 1.00010449, 1.00010449, 1.00010449, 1.00010449]) Mpc
- property backup_filename¶
- property basename¶
- box(left_edge, right_edge, **kwargs)¶
box is a wrapper to the Region object for creating a region without having to specify a center value. It assumes the center is the midpoint between the left_edge and right_edge.
Keyword arguments are passed to the initializer of the YTRegion object (e.g. ds.region).
- property checksum¶
Computes md5 sum of a dataset.
Note: Currently this property is unable to determine a complete set of files that are a part of a given dataset. As a first approximation, the checksum of parameter_file is calculated. In case parameter_file is a directory, the checksum of all files inside the directory is calculated.
- close()¶
- coordinates = None¶
- create_field_info()¶
- default_field = ('gas', 'density')¶
- default_fluid_type = 'gas'¶
- default_units = {'length_unit': 'cm', 'magnetic_unit': 'gauss', 'mass_unit': 'g', 'temperature_unit': 'K', 'time_unit': 's', 'velocity_unit': 'cm/s'}¶
- define_unit(symbol, value, tex_repr=None, offset=None, prefixable=False)¶
Define a new unit and add it to the dataset’s unit registry.
- Parameters:
symbol (string) – The symbol for the new unit.
value (tuple or YTQuantity) – The definition of the new unit in terms of some other units. For example, one would define a new “mph” unit with (1.0, “mile/hr”)
tex_repr (string, optional) – The LaTeX representation of the new unit. If one is not supplied, it will be generated automatically based on the symbol string.
offset (float, optional) – The default offset for the unit. If not set, an offset of 0 is assumed.
prefixable (bool, optional) – Whether or not the new unit can use SI prefixes. Default: False
Examples
>>> ds.define_unit("mph", (1.0, "mile/hr"))
>>> two_weeks = YTQuantity(14.0, "days")
>>> ds.define_unit("fortnight", two_weeks)
- property derived_field_list¶
- property directory¶
- domain_offset = array([0, 0, 0])¶
- property field_list¶
- property fields¶
- fields_detected = False¶
- find_field_values_at_point(fields, coords)¶
Returns the values [field1, field2,…] of the fields at the given coordinates. Returns a list of field values in the same order as the input fields.
- find_field_values_at_points(fields, coords)¶
Returns the values [field1, field2,…] of the fields at the given [(x1, y1, z1), (x2, y2, z2),…] points. Returns a list of field values in the same order as the input fields.
- find_max(field, source=None, to_array=True)¶
Returns (value, location) of the maximum of a given field.
This is a wrapper around _find_extremum
- find_min(field, source=None, to_array=True)¶
Returns (value, location) for the minimum of a given field.
This is a wrapper around _find_extremum
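A minimal sketch of both extremum helpers, assuming ds has been loaded and carries a ("gas", "density") field:
>>> val_max, loc_max = ds.find_max(("gas", "density"))
>>> val_min, loc_min = ds.find_min(("gas", "density"))
>>> print(val_max, loc_max)  # a YTQuantity and an array of coordinates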
- force_periodicity(val=True)¶
Override box periodicity to (True, True, True). Use ds.force_periodicity(False) to use the actual box periodicity.
- property fullpath¶
- get_smallest_appropriate_unit(v, quantity='distance', return_quantity=False)¶
Returns, as a string, the largest whole unit smaller than the YTQuantity passed to it.
The quantity keyword can be equal to distance or time. In the case of distance, the units are: ‘Mpc’, ‘kpc’, ‘pc’, ‘au’, ‘rsun’, ‘km’, etc. For time, the units are: ‘Myr’, ‘kyr’, ‘yr’, ‘day’, ‘hr’, ‘s’, ‘ms’, etc.
If return_quantity is set to True, it finds the largest YTQuantity object with a whole unit and a power of ten as the coefficient, and it returns this YTQuantity.
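A brief sketch, assuming ds has been loaded (the width value is illustrative):
>>> width = ds.quan(0.25, "Mpc")
>>> ds.get_smallest_appropriate_unit(width)  # a unit string such as 'kpc'
>>> ds.get_smallest_appropriate_unit(width, return_quantity=True)  # a YTQuantity instead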
- get_unit_from_registry(unit_str)¶
Creates a unit object matching the string expression, using this dataset’s unit registry.
- Parameters:
unit_str (str) – string that we can parse for a sympy Expr.
- has_key(key)¶
Checks units, parameters, and conversion factors. Returns a boolean.
- property index¶
- property ires_factor¶
- known_filters: dict[ParticleType, ParticleFilter] | None = None¶
- property max_level¶
- property min_level¶
- property parameter_filename¶
- property particle_fields_by_type¶
- property particle_type_counts¶
- particle_unions: dict[ParticleType, ParticleUnion] | None = None¶
- property particles_exist¶
- property periodicity¶
- print_key_parameters()¶
- print_stats()¶
- property quan¶
Converts a scalar into a yt.units.yt_array.YTQuantity.
The returned YTQuantity will be dimensionless by default, but can be cast to arbitrary units using the units keyword argument.
- Parameters:
input_scalar (an integer or floating point scalar) – The scalar to attach units to
units (String unit specification, unit symbol or astropy object) – The units of the quantity. Powers must be specified using python syntax (cm**3, not cm^3).
input_units (Deprecated in favor of 'units')
dtype (string or NumPy dtype object) – The dtype of the array data.
Examples
>>> import yt
>>> ds = yt.load("IsolatedGalaxy/galaxy0030/galaxy0030")
>>> a = ds.quan(1, "cm")
>>> b = ds.quan(2, "m")
>>> a + b
201.0 cm
>>> b + a
2.01 m
Quantities created this way automatically know about the unit system of the dataset.
>>> a = ds.quan(5, "code_length")
>>> a.in_cgs()
1.543e+25 cm
- relative_refinement(l0, l1)¶
- set_code_units()¶
- set_field_label_format(format_property, value)¶
Set format properties for how fields will be written out. Accepts:
format_property : string indicating what property to set
value : the value to set for that format_property
- set_units()¶
Creates the unit registry for this dataset.
- setup_cosmology()¶
If this dataset is cosmological, add a cosmology object.
- setup_deprecated_fields()¶
- storage_filename = None¶
- property units¶
- class yt.frontends.fits.data_structures.EventsFITSHierarchy(ds, dataset_type='fits')[source]¶
Bases:
FITSHierarchy
- clear_all_data()¶
This routine clears all the data currently being held onto by the grids and the data io handler.
- comm = None¶
- convert(unit)¶
- float_type = 'float64'¶
- get_data(node, name)¶
Return the dataset with a given name located at node in the datafile.
- get_dependencies(fields)¶
- get_levels()¶
- get_smallest_dx()¶
Returns (in code units) the smallest cell size in the simulation.
- property grid_corners¶
- lock_grids_to_parents()¶
This function locks grid edges to their parents.
This is useful in cases where the grid structure may be somewhat irregular, or where setting the left and right edges is a lossy process. It is designed to correct situations where left/right edges may be set slightly incorrectly, resulting in discontinuities in images and the like.
- property parameters¶
- partition_index_2d(axis)¶
- partition_index_3d(ds, padding=0.0, rank_ratio=1)¶
- partition_index_3d_bisection_list()¶
Returns an array that is used to drive _partition_index_3d_bisection, below.
- partition_region_3d(left_edge, right_edge, padding=0.0, rank_ratio=1)¶
Given a region, it subdivides it into smaller regions for parallel analysis.
- print_stats()¶
Prints out (stdout) relevant information about the simulation
- save_data(array, node, name, set_attr=None, force=False, passthrough=False)¶
Arbitrary numpy data will be saved to the region in the datafile described by node and name. If the data file does not exist, no error is raised and the data is simply not saved.
- select_grids(level)¶
Returns an array of grids at level.
- class yt.frontends.fits.data_structures.FITSDataset(filename=None, *args, **kwargs)[source]¶
Bases:
Dataset
- add_deposited_particle_field(deposit_field, method, kernel_name='cubic', weight_field=None)¶
Add a new deposited particle field
Creates a new deposited field based on the particle deposit_field.
- Parameters:
deposit_field (tuple) – The field name tuple of the particle field the deposited field will be created from. This must be a field name tuple so yt can appropriately infer the correct particle type.
method (string) – This is the “method name” which will be looked up in the particle_deposit namespace as methodname_deposit. Current methods include simple_smooth, sum, std, cic, weighted_mean, nearest and count.
kernel_name (string, default 'cubic') – This is the name of the smoothing kernel to use. It is only used for the simple_smooth method and is otherwise ignored. Current supported kernel names include cubic, quartic, quintic, wendland2, wendland4, and wendland6.
weight_field ((field_type, field_name) or None) – Weighting field name for deposition method weighted_mean. If None, use the particle mass.
- Return type:
The field name tuple for the newly created field.
- add_field(name, function, sampling_type, *, force_override=False, **kwargs)¶
Dataset-specific call to add_field
Add a new field, along with supplemental metadata, to the list of available fields. This respects a number of arguments, all of which are passed on to the constructor for DerivedField.
- Parameters:
name (str) – is the name of the field.
function (callable) – A function handle that defines the field. Should accept arguments (field, data)
sampling_type (str) – “cell” or “particle” or “local”
force_override (bool) – If False (default), an error will be raised if a field of the same name already exists.
units (str) – A plain text string encoding the unit. Powers must be in python syntax (** instead of ^).
take_log (bool) – Describes whether the field should be logged
validators (list) – A list of FieldValidator objects
vector_field (bool) – Describes the dimensionality of the field. Currently unused.
display_name (str) – A name used in the plots
force_override – Whether to override an existing derived field. Does not work with on-disk fields.
- add_gradient_fields(fields=None)¶
Add gradient fields.
Creates four new grid-based fields that represent the components of the gradient of an existing field, plus an extra field for the magnitude of the gradient. The gradient is computed using second-order centered differences.
- Parameters:
fields (str or tuple(str, str), or a list of the previous) – Label(s) for at least one field. Can either represent a tuple (<field type>, <field fname>) or simply the field name. Warning: several field types may match the provided field name, in which case the first one discovered internally is used.
- Return type:
A list of field name tuples for the newly created fields.
- Raises:
YTFieldNotParsable – If fields are not parsable to yt field keys.
YTFieldNotFound : – If at least one field can not be identified.
Examples
>>> grad_fields = ds.add_gradient_fields(("gas", "density"))
>>> print(grad_fields)
[
    ("gas", "density_gradient_x"),
    ("gas", "density_gradient_y"),
    ("gas", "density_gradient_z"),
    ("gas", "density_gradient_magnitude"),
]
Note that the above example assumes ds.geometry == ‘cartesian’. In general, the function will create gradient components along the axes of the dataset coordinate system. For instance, with cylindrical data, one gets ‘density_gradient_<r,theta,z>’
- add_mesh_sampling_particle_field(sample_field, ptype='all')¶
Add a new mesh sampling particle field
Creates a new particle field which has the value of the deposit_field at the location of each particle of type ptype.
- Parameters:
sample_field (tuple) – The field name tuple of the mesh field to be deposited onto the particles. This must be a field name tuple so yt can appropriately infer the correct particle type.
ptype (string, default 'all') – The particle type onto which the deposition will occur.
- Return type:
The field name tuple for the newly created field.
Examples
>>> ds = yt.load("output_00080/info_00080.txt") ... ds.add_mesh_sampling_particle_field(("gas", "density"), ptype="all")
>>> print("The density at the location of the particle is:") ... print(ds.r["all", "cell_gas_density"]) The density at the location of the particle is: [9.33886124e-30 1.22174333e-28 1.20402333e-28 ... 2.77410331e-30 8.79467609e-31 3.50665136e-30] g/cm**3
>>> len(ds.r["all", "cell_gas_density"]) == len(ds.r["all", "particle_ones"]) True
- add_particle_filter(filter)¶
Add particle filter to the dataset.
Add filter to the dataset and set up the relevant derived fields. It will also add any filtered_type that the filter depends on.
- add_particle_union(union)¶
- all_data(find_max=False, **kwargs)¶
all_data is a wrapper to the Region object for creating a region which covers the entire simulation domain.
- property arr¶
Converts an array into a yt.units.yt_array.YTArray.
The returned YTArray will be dimensionless by default, but can be cast to arbitrary units using the units keyword argument.
- Parameters:
input_array (Iterable) – A tuple, list, or array to attach units to
units (String unit specification, unit symbol or astropy object) – The units of the array. Powers must be specified using python syntax (cm**3, not cm^3).
input_units (Deprecated in favor of 'units')
dtype (string or NumPy dtype object) – The dtype of the returned array data
Examples
>>> import yt
>>> import numpy as np
>>> ds = yt.load("IsolatedGalaxy/galaxy0030/galaxy0030")
>>> a = ds.arr([1, 2, 3], "cm")
>>> b = ds.arr([4, 5, 6], "m")
>>> a + b
YTArray([ 401., 502., 603.]) cm
>>> b + a
YTArray([ 4.01, 5.02, 6.03]) m
Arrays returned by this function know about the dataset’s unit system
>>> a = ds.arr(np.ones(5), "code_length")
>>> a.in_units("Mpccm/h")
YTArray([ 1.00010449, 1.00010449, 1.00010449, 1.00010449, 1.00010449]) Mpc
- property backup_filename¶
- property basename¶
- box(left_edge, right_edge, **kwargs)¶
box is a wrapper to the Region object for creating a region without having to specify a center value. It assumes the center is the midpoint between the left_edge and right_edge.
Keyword arguments are passed to the initializer of the YTRegion object (e.g. ds.region).
- property checksum¶
Computes md5 sum of a dataset.
Note: Currently this property is unable to determine a complete set of files that are a part of a given dataset. As a first approximation, the checksum of parameter_file is calculated. In case parameter_file is a directory, the checksum of all files inside the directory is calculated.
- coordinates = None¶
- create_field_info()¶
- default_field = ('gas', 'density')¶
- default_fluid_type = 'gas'¶
- default_units = {'length_unit': 'cm', 'magnetic_unit': 'gauss', 'mass_unit': 'g', 'temperature_unit': 'K', 'time_unit': 's', 'velocity_unit': 'cm/s'}¶
- define_unit(symbol, value, tex_repr=None, offset=None, prefixable=False)¶
Define a new unit and add it to the dataset’s unit registry.
- Parameters:
symbol (string) – The symbol for the new unit.
value (tuple or YTQuantity) – The definition of the new unit in terms of some other units. For example, one would define a new “mph” unit with (1.0, “mile/hr”)
tex_repr (string, optional) – The LaTeX representation of the new unit. If one is not supplied, it will be generated automatically based on the symbol string.
offset (float, optional) – The default offset for the unit. If not set, an offset of 0 is assumed.
prefixable (bool, optional) – Whether or not the new unit can use SI prefixes. Default: False
Examples
>>> ds.define_unit("mph", (1.0, "mile/hr"))
>>> two_weeks = YTQuantity(14.0, "days")
>>> ds.define_unit("fortnight", two_weeks)
- property derived_field_list¶
- property directory¶
- domain_offset = array([0, 0, 0])¶
- property field_list¶
- property fields¶
- fields_detected = False¶
- find_field_values_at_point(fields, coords)¶
Returns the values [field1, field2,…] of the fields at the given coordinates. Returns a list of field values in the same order as the input fields.
- find_field_values_at_points(fields, coords)¶
Returns the values [field1, field2,…] of the fields at the given [(x1, y1, z1), (x2, y2, z2),…] points. Returns a list of field values in the same order as the input fields.
- find_max(field, source=None, to_array=True)¶
Returns (value, location) of the maximum of a given field.
This is a wrapper around _find_extremum
- find_min(field, source=None, to_array=True)¶
Returns (value, location) for the minimum of a given field.
This is a wrapper around _find_extremum
- force_periodicity(val=True)¶
Override box periodicity to (True, True, True). Use ds.force_periodicity(False) to use the actual box periodicity.
- property fullpath¶
- get_smallest_appropriate_unit(v, quantity='distance', return_quantity=False)¶
Returns, as a string, the largest whole unit smaller than the YTQuantity passed to it.
The quantity keyword can be equal to distance or time. In the case of distance, the units are: ‘Mpc’, ‘kpc’, ‘pc’, ‘au’, ‘rsun’, ‘km’, etc. For time, the units are: ‘Myr’, ‘kyr’, ‘yr’, ‘day’, ‘hr’, ‘s’, ‘ms’, etc.
If return_quantity is set to True, it finds the largest YTQuantity object with a whole unit and a power of ten as the coefficient, and it returns this YTQuantity.
- get_unit_from_registry(unit_str)¶
Creates a unit object matching the string expression, using this dataset’s unit registry.
- Parameters:
unit_str (str) – string that we can parse for a sympy Expr.
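A brief sketch, assuming ds has been loaded (the unit expression is illustrative):
>>> u = ds.get_unit_from_registry("erg/s")
>>> print(u.dimensions)  # the dimension expression of the parsed unit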
- has_key(key)¶
Checks units, parameters, and conversion factors. Returns a boolean.
- property index¶
- property ires_factor¶
- known_filters: dict[ParticleType, ParticleFilter] | None = None¶
- property max_level¶
- property min_level¶
- property parameter_filename¶
- property particle_fields_by_type¶
- property particle_type_counts¶
- particle_unions: dict[ParticleType, ParticleUnion] | None = None¶
- property particles_exist¶
- property periodicity¶
- print_key_parameters()¶
- print_stats()¶
- property quan¶
Converts a scalar into a yt.units.yt_array.YTQuantity.
The returned YTQuantity will be dimensionless by default, but can be cast to arbitrary units using the units keyword argument.
- Parameters:
input_scalar (an integer or floating point scalar) – The scalar to attach units to
units (String unit specification, unit symbol or astropy object) – The units of the quantity. Powers must be specified using python syntax (cm**3, not cm^3).
input_units (Deprecated in favor of 'units')
dtype (string or NumPy dtype object) – The dtype of the array data.
Examples
>>> import yt
>>> ds = yt.load("IsolatedGalaxy/galaxy0030/galaxy0030")
>>> a = ds.quan(1, "cm")
>>> b = ds.quan(2, "m")
>>> a + b
201.0 cm
>>> b + a
2.01 m
Quantities created this way automatically know about the unit system of the dataset.
>>> a = ds.quan(5, "code_length")
>>> a.in_cgs()
1.543e+25 cm
- relative_refinement(l0, l1)¶
- set_code_units()¶
- set_field_label_format(format_property, value)¶
Set format properties for how fields will be written out. Accepts:
format_property : string indicating what property to set
value : the value to set for that format_property
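A brief sketch; "ionization_label" is the format property yt's documentation describes for this hook (controlling whether ion species render as, e.g., "H II" rather than "H p1"):
>>> ds.set_field_label_format("ionization_label", "roman_numeral")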
- set_units()¶
Creates the unit registry for this dataset.
- setup_cosmology()¶
If this dataset is cosmological, add a cosmology object.
- setup_deprecated_fields()¶
- storage_filename = None¶
- property units¶
- class yt.frontends.fits.data_structures.FITSGrid(id, index, level)[source]¶
Bases:
AMRGridPatch
- OverlappingSiblings = None¶
- apply_units(arr, units)¶
- argmax(field, axis=None)¶
Return the values at which the field is maximized.
This will, in a parallel-aware fashion, find the maximum value and then return to you the values at that maximum location that are requested for “axis”. By default it will return the spatial positions (in the natural coordinate system), but it can be any field
- Parameters:
field (string or tuple field name) – The field to maximize.
axis (string or list of strings, optional) – If supplied, the fields to sample along; if not supplied, defaults to the coordinate fields. This can be the name of the coordinate fields (i.e., ‘x’, ‘y’, ‘z’) or a list of fields, but cannot be 0, 1, 2.
- Return type:
A list of YTQuantities as specified by the axis argument.
Examples
>>> temp_at_max_rho = reg.argmax(
...     ("gas", "density"), axis=("gas", "temperature")
... )
>>> max_rho_xyz = reg.argmax(("gas", "density"))
>>> t_mrho, v_mrho = reg.argmax(
...     ("gas", "density"),
...     axis=[("gas", "temperature"), ("gas", "velocity_magnitude")],
... )
>>> x, y, z = reg.argmax(("gas", "density"))
- argmin(field, axis=None)¶
Return the values at which the field is minimized.
This will, in a parallel-aware fashion, find the minimum value and then return to you the values at that minimum location that are requested for “axis”. By default it will return the spatial positions (in the natural coordinate system), but it can be any field
- Parameters:
field (string or tuple field name) – The field to minimize.
axis (string or list of strings, optional) – If supplied, the fields to sample along; if not supplied, defaults to the coordinate fields. This can be the name of the coordinate fields (i.e., ‘x’, ‘y’, ‘z’) or a list of fields, but cannot be 0, 1, 2.
- Return type:
A list of YTQuantities as specified by the axis argument.
Examples
>>> temp_at_min_rho = reg.argmin(
...     ("gas", "density"), axis=("gas", "temperature")
... )
>>> min_rho_xyz = reg.argmin(("gas", "density"))
>>> t_mrho, v_mrho = reg.argmin(
...     ("gas", "density"),
...     axis=[("gas", "temperature"), ("gas", "velocity_magnitude")],
... )
>>> x, y, z = reg.argmin(("gas", "density"))
- property blocks¶
- property child_index_mask¶
Generates self.child_index_mask, which is -1 where there is no child, and otherwise has the ID of the grid that resides there.
- property child_indices¶
- property child_mask¶
Generates self.child_mask, which is zero where child grids exist (and thus, where higher resolution data is available).
- chunks(fields, chunking_style, **kwargs)¶
- clear_data()¶
Clear out the following things: child_mask, child_indices, all fields, all field parameters.
- clone()¶
Clone a data object.
This will make a duplicate of a data object; note that the field_parameters may not necessarily be deeply-copied. If you modify the field parameters in-place, it may or may not be shared between the objects, depending on the type of object that that particular field parameter is.
Notes
One use case for this is to have multiple identical data objects that are being chunked over in different orders.
Examples
>>> ds = yt.load("IsolatedGalaxy/galaxy0030/galaxy0030") >>> sp = ds.sphere("c", 0.1) >>> sp_clone = sp.clone() >>> sp["gas", "density"] >>> print(sp.field_data.keys()) [("gas", "density")] >>> print(sp_clone.field_data.keys()) []
- comm = None¶
- convert(datatype)¶
This will attempt to convert a given unit to cgs from code units. It either returns the multiplicative factor or throws a KeyError.
- count(selector)¶
- count_particles(selector, x, y, z)¶
- create_firefly_object(datadir=None, fields_to_include=None, fields_units=None, default_decimation_factor=100, velocity_units='km/s', coordinate_units='kpc', show_unused_fields=0, *, JSONdir=None, match_any_particle_types=True, **kwargs)¶
This function links a region of data stored in a yt dataset to the Python frontend API for [Firefly](http://github.com/ageller/Firefly), a browser-based particle visualization tool.
- Parameters:
datadir (string) – Path to where any .json files should be saved. If a relative path is given, it is assumed to be relative to ${HOME}. A value of None will default to ${HOME}/Data.
fields_to_include (array_like of strings or field tuples) – A list of fields that you want to include in your Firefly visualization for on-the-fly filtering and colormapping.
default_decimation_factor (integer) – The factor by which you want to decimate each particle group (e.g. if there are 1e7 total particles in your simulation you might want to set this to 100 at first). Randomly samples your data like shuffled_data[::decimation_factor] so as to not overtax a system. This is adjustable on a per particle group basis by changing the returned reader’s reader.particleGroup[i].decimation_factor before calling reader.writeToDisk().
velocity_units (string) – The units that the velocity should be converted to in order to show streamlines in Firefly. Defaults to km/s.
coordinate_units (string) – The units that the coordinates should be converted to. Defaults to kpc.
show_unused_fields (boolean) – A flag to optionally print the fields that are available in the dataset but were not explicitly requested to be tracked.
match_any_particle_types (boolean) – If True, when any of the fields_to_include match multiple particle groups then the field will be added for all matching particle groups. If False, an error is raised when encountering an ambiguous field. Default is True.
Any additional keyword arguments are passed to firefly.data_reader.Reader.__init__.
- Returns:
reader – A reader object from Firefly, configured to output the currently selected region
- Return type:
Firefly.data_reader.Reader object
Examples
>>> ramses_ds = yt.load(
...     "/Users/agurvich/Desktop/yt_workshop/"
...     + "DICEGalaxyDisk_nonCosmological/output_00002/info_00002.txt"
... )
>>> region = ramses_ds.sphere(ramses_ds.domain_center, (1000, "kpc"))
>>> reader = region.create_firefly_object(
...     "IsoGalaxyRamses",
...     fields_to_include=[
...         "particle_extra_field_1",
...         "particle_extra_field_2",
...     ],
...     fields_units=["dimensionless", "dimensionless"],
... )
>>> reader.settings["color"]["io"] = [1, 1, 0, 1]
>>> reader.particleGroups[0].decimation_factor = 100
>>> reader.writeToDisk()
- deposit(positions, fields=None, method=None, kernel_name='cubic')¶
- property fcoords¶
- property fcoords_vertex¶
- property fwidth¶
- get_data(fields=None)¶
- get_dependencies(fields)¶
- get_field_parameter(name, default=None)¶
This is typically only used by derived field functions, but it returns parameters used to generate fields.
- get_global_startindex()¶
Return the integer starting index for each dimension at the current level.
- get_position(index)¶
Returns center position of an index.
- get_vertex_centered_data(fields: list[tuple[str, str]], smoothed: bool = True, no_ghost: bool = False)¶
- has_field_parameter(name)¶
Checks if a field parameter is set.
- has_key(key)¶
Checks if a data field already exists.
- property icoords¶
- property index¶
- integrate(field, weight=None, axis=None, *, moment=1)¶
Compute the integral (projection) of a field along an axis.
This projects a field along an axis.
- Parameters:
field (string or tuple field name) – The field to project.
weight (string or tuple field name) – The field to weight the projection by
axis (string) – The axis to project along.
moment (integer, optional) – for a weighted projection, moment = 1 (the default) corresponds to a weighted average. moment = 2 corresponds to a weighted standard deviation.
- Return type:
YTProjection
Examples
>>> column_density = reg.integrate(("gas", "density"), axis=("index", "z"))
- property ires¶
- keys()¶
- max(field, axis=None)¶
Compute the maximum of a field, optionally along an axis.
This will, in a parallel-aware fashion, compute the maximum of the given field. Supplying an axis will result in a return value of a YTProjection, with method ‘max’ for maximum intensity. If the max has already been requested, it will use the cached extrema value.
- Parameters:
field (string or tuple field name) – The field to maximize.
axis (string, optional) – If supplied, the axis to project the maximum along.
- Return type:
Either a scalar or a YTProjection.
Examples
>>> max_temp = reg.max(("gas", "temperature"))
>>> max_temp_proj = reg.max(("gas", "temperature"), axis=("index", "x"))
- property max_level¶
- mean(field, axis=None, weight=None)¶
Compute the mean of a field, optionally along an axis, with a weight.
This will, in a parallel-aware fashion, compute the mean of the given field. If an axis is supplied, it will return a projection, where the weight is also supplied. By default the weight field will be “ones” or “particle_ones”, depending on the field being averaged, resulting in an unweighted average.
- Parameters:
field (string or tuple field name) – The field to average.
axis (string, optional) – If supplied, the axis to compute the mean along (i.e., to project along)
weight (string, optional) – The field to use as a weight.
- Return type:
Scalar or YTProjection.
Examples
>>> avg_rho = reg.mean(("gas", "density"), weight="cell_volume")
>>> rho_weighted_T = reg.mean(
...     ("gas", "temperature"), axis=("index", "y"), weight=("gas", "density")
... )
- min(field, axis=None)¶
Compute the minimum of a field.
This will, in a parallel-aware fashion, compute the minimum of the given field. Supplying an axis will result in a return value of a YTProjection, with method ‘min’ for minimum intensity. If the min has already been requested, it will use the cached extrema value.
- Parameters:
field (string or tuple field name) – The field to minimize.
axis (string, optional) – If supplied, the axis to compute the minimum along.
- Return type:
Either a scalar or a YTProjection.
Examples
>>> min_temp = reg.min(("gas", "temperature"))
>>> min_temp_proj = reg.min(("gas", "temperature"), axis=("index", "x"))
- property min_level¶
- particle_operation(*args, **kwargs)¶
- partition_index_2d(axis)¶
- partition_index_3d(ds, padding=0.0, rank_ratio=1)¶
- partition_index_3d_bisection_list()¶
Returns an array that is used to drive _partition_index_3d_bisection, below.
- partition_region_3d(left_edge, right_edge, padding=0.0, rank_ratio=1)¶
Given a region, it subdivides it into smaller regions for parallel analysis.
- property pf¶
- profile(bin_fields, fields, n_bins=64, extrema=None, logs=None, units=None, weight_field=('gas', 'mass'), accumulation=False, fractional=False, deposition='ngp', *, override_bins=None)¶
Create a 1, 2, or 3D profile object from this data_source.
The dimensionality of the profile object is chosen by the number of fields given in the bin_fields argument. This simply calls yt.data_objects.profiles.create_profile().
- Parameters:
bin_fields (list of strings) – List of the binning fields for profiling.
fields (list of strings) – The fields to be profiled.
n_bins (int or list of ints) – The number of bins in each dimension. If None, 64 bins for each bin are used for each bin field. Default: 64.
extrema (dict of min, max tuples) – Minimum and maximum values of the bin_fields for the profiles. The keys correspond to the field names. Defaults to the extrema of the bin_fields of the dataset. If a units dict is provided, extrema are understood to be in the units specified in the dictionary.
logs (dict of boolean values) – Whether or not to log the bin_fields for the profiles. The keys correspond to the field names. Defaults to the take_log attribute of the field.
units (dict of strings) – The units of the fields in the profiles, including the bin_fields.
weight_field (str or tuple field identifier) – The weight field for computing weighted average for the profile values. If None, the profile values are sums of the data in each bin.
accumulation (bool or list of bools) – If True, the profile values for a bin n are the cumulative sum of all the values from bin 0 to n. If -True, the sum is reversed so that the value for bin n is the cumulative sum from bin N (total bins) to n. If the profile is 2D or 3D, a list of values can be given to control the summation in each dimension independently. Default: False.
fractional (bool) – If True, the profile values are divided by the sum of all the profile data such that the profile represents a probability distribution function.
deposition (str) – Controls the type of deposition used for ParticlePhasePlots. Valid choices are 'ngp' and 'cic'. Default is 'ngp'. This parameter is ignored if the input fields are not of particle type.
override_bins (dict of bins to profile plot with) – If set, ignores n_bins and extrema settings and uses the supplied bins to profile the field. If a units dict is provided, bins are understood to be in the units specified in the dictionary.
Examples
Create a 1d profile. Access bin field from profile.x and field data from profile[<field_name>].
>>> ds = yt.load("DD0046/DD0046")
>>> ad = ds.all_data()
>>> profile = ad.profile(
...     [("gas", "density")],
...     [("gas", "temperature"), ("gas", "velocity_x")],
... )
>>> print(profile.x)
>>> print(profile["gas", "temperature"])
>>> plot = profile.plot()
- ptp(field)¶
Compute the range of values (maximum - minimum) of a field.
This will, in a parallel-aware fashion, compute the “peak-to-peak” of the given field.
- Parameters:
field (string or tuple field name) – The field to average.
- Return type:
Scalar
Examples
>>> rho_range = reg.ptp(("gas", "density"))
- retrieve_ghost_zones(n_zones, fields, all_levels=False, smoothed=False)¶
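A brief, hedged sketch of pulling a field with one layer of ghost zones from a single grid (assuming ds has been indexed and carries a ("gas", "density") field):
>>> g = ds.index.grids[0]
>>> gz = g.retrieve_ghost_zones(1, [("gas", "density")])
>>> gz["gas", "density"].shape  # the grid dimensions padded by one zone per side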
- save_as_dataset(filename=None, fields=None)¶
Export a data object to a reloadable yt dataset.
This function will take a data object and output a dataset containing either the fields presently existing or fields given in the fields list. The resulting dataset can be reloaded as a yt dataset.
- Parameters:
filename (str, optional) – The name of the file to be written. If None, the name will be a combination of the original dataset and the type of data container.
fields (list of string or tuple field names, optional) – If this is supplied, it is the list of fields to be saved to disk. If not supplied, all the fields that have been queried will be saved.
- Returns:
filename – The name of the file that has been created.
- Return type:
str
Examples
>>> import yt
>>> ds = yt.load("enzo_tiny_cosmology/DD0046/DD0046")
>>> sp = ds.sphere(ds.domain_center, (10, "Mpc"))
>>> fn = sp.save_as_dataset(fields=[("gas", "density"), ("gas", "temperature")])
>>> sphere_ds = yt.load(fn)
>>> # the original data container is available as the data attribute
>>> print(sphere_ds.data["gas", "density"])
[ 4.46237613e-32 4.86830178e-32 4.46335118e-32 ..., 6.43956165e-30 3.57339907e-30 2.83150720e-30] g/cm**3
>>> ad = sphere_ds.all_data()
>>> print(ad["gas", "temperature"])
[ 1.00000000e+00 1.00000000e+00 1.00000000e+00 ..., 4.40108359e+04 4.54380547e+04 4.72560117e+04] K
- select(selector, source, dest, offset)¶
- select_blocks(selector)¶
- select_fcoords(dobj)¶
- select_fwidth(dobj)¶
- select_icoords(dobj)¶
- select_ires(dobj)¶
- select_particles(selector, x, y, z)¶
- select_tcoords(dobj)¶
- property selector¶
- set_field_parameter(name, val)¶
Here we set up dictionaries that get passed up and down and ultimately to derived fields.
- property shape¶
- smooth(*args, **kwargs)¶
- std(field, axis=None, weight=None)¶
Compute the standard deviation of a field, optionally along an axis, with a weight.
This will, in a parallel-aware fashion, compute the standard deviation of the given field. If an axis is supplied, it will return a projection, where the weight is also supplied.
By default the weight field will be “ones” or “particle_ones”, depending on the field, resulting in an unweighted standard deviation.
- Parameters:
field (string or tuple field name) – The field to calculate the standard deviation of
axis (string, optional) – If supplied, the axis to compute the standard deviation along (i.e., to project along)
weight (string, optional) – The field to use as a weight.
- Return type:
Scalar or YTProjection.
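Examples of both forms, as a minimal sketch assuming reg is a data object (for example a sphere) built from ds:
>>> sigma_T = reg.std(("gas", "temperature"), weight=("gas", "mass"))
>>> sigma_T_proj = reg.std(
...     ("gas", "temperature"), axis=("index", "z"), weight=("gas", "density")
... )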
- sum(field, axis=None)¶
Compute the sum of a field, optionally along an axis.
This will, in a parallel-aware fashion, compute the sum of the given field. If an axis is specified, it will return a projection (using method type “sum”, which does not take into account path length) along that axis.
- Parameters:
field (string or tuple field name) – The field to sum.
axis (string, optional) – If supplied, the axis to sum along.
- Return type:
Either a scalar or a YTProjection.
Examples
>>> total_vol = reg.sum("cell_volume")
>>> cell_count = reg.sum(("index", "ones"), axis=("index", "x"))
- property tiles¶
- to_astropy_table(fields)¶
Export region data to an astropy.table.QTable, a unit-aware Table object. The QTable can then be exported to an ASCII file, FITS file, etc.
See the AstroPy Table docs for more details: http://docs.astropy.org/en/stable/table/
- Parameters:
fields (list of strings or tuple field names) – This is the list of fields to be exported into the QTable.
Examples
>>> sp = ds.sphere("c", (1.0, "Mpc"))
>>> t = sp.to_astropy_table([("gas", "density"), ("gas", "temperature")])
- to_dataframe(fields)¶
Export a data object to a DataFrame.
This function will take a data object and an optional list of fields and export them to a DataFrame object. If pandas is not importable, this will raise ImportError.
- Parameters:
fields (list of strings or tuple field names) – This is the list of fields to be exported into the DataFrame.
- Returns:
df – The data contained in the object.
- Return type:
DataFrame
Examples
>>> dd = ds.all_data()
>>> df = dd.to_dataframe([("gas", "density"), ("gas", "temperature")])
- to_glue(fields, label='yt', data_collection=None)¶
Takes specific fields in the container and exports them to Glue (http://glueviz.org) for interactive analysis. Optionally add a label. If you are already within the Glue environment, you can pass a data_collection object, otherwise Glue will be started.
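A brief sketch (requires the optional glue package; the sphere and fields are illustrative):
>>> sp = ds.sphere("c", (0.1, "Mpc"))
>>> sp.to_glue([("gas", "density"), ("gas", "temperature")], label="my_sphere")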
- write_out(filename, fields=None, format='%0.16e')¶
Write out the YTDataContainer object in a text file.
This function will take a data object and produce a tab delimited text file containing the fields presently existing and the fields given in the fields list.
- Parameters:
filename (String) – The name of the file to write to.
fields (List of string, Default = None) – If this is supplied, these fields will be added to the list of fields to be saved to disk. If not supplied, whatever fields presently exist will be used.
format (String, Default = "%0.16e") – Format of numbers to be written in the file.
- Raises:
ValueError – Raised when there is no existing field.
YTException – Raised when field_type of supplied fields is inconsistent with the field_type of existing fields.
Examples
>>> ds = fake_particle_ds()
>>> sp = ds.sphere(ds.domain_center, 0.25)
>>> sp.write_out("sphere_1.txt")
>>> sp.write_out("sphere_2.txt", fields=["cell_volume"])
- class yt.frontends.fits.data_structures.FITSHierarchy(ds, dataset_type='fits')[source]¶
Bases:
GridIndex
- clear_all_data()¶
This routine clears all the data currently being held onto by the grids and the data io handler.
- comm = None¶
- convert(unit)¶
- float_type = 'float64'¶
- get_data(node, name)¶
Return the dataset with a given name located at node in the datafile.
- get_dependencies(fields)¶
- get_levels()¶
- get_smallest_dx()¶
Returns (in code units) the smallest cell size in the simulation.
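For example, assuming ds has been loaded:
>>> dx = ds.index.get_smallest_dx()
>>> print(dx)  # the smallest cell width, in code units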
- property grid_corners¶
- lock_grids_to_parents()¶
This function locks grid edges to their parents.
This is useful in cases where the grid structure may be somewhat irregular, or where setting the left and right edges is a lossy process. It is designed to correct situations where left/right edges may be set slightly incorrectly, resulting in discontinuities in images and the like.
- property parameters¶
- partition_index_2d(axis)¶
- partition_index_3d(ds, padding=0.0, rank_ratio=1)¶
- partition_index_3d_bisection_list()¶
Returns an array that is used to drive _partition_index_3d_bisection, below.
- partition_region_3d(left_edge, right_edge, padding=0.0, rank_ratio=1)¶
Given a region, it subdivides it into smaller regions for parallel analysis.
- print_stats()¶
Prints out (stdout) relevant information about the simulation
- save_data(array, node, name, set_attr=None, force=False, passthrough=False)¶
Arbitrary numpy data will be saved to the region in the datafile described by node and name. If the data file does not exist, no error is raised and the data is simply not saved.
- select_grids(level)¶
Returns an array of grids at level.
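A brief sketch of selecting the coarsest-level grids, assuming ds has been loaded:
>>> root_grids = ds.index.select_grids(0)
>>> print(len(root_grids), "grids on level 0")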
- class yt.frontends.fits.data_structures.SkyDataFITSDataset(filename=None, *args, **kwargs)[source]¶
Bases:
FITSDataset
- add_deposited_particle_field(deposit_field, method, kernel_name='cubic', weight_field=None)¶
Add a new deposited particle field
Creates a new deposited field based on the particle deposit_field.
- Parameters:
deposit_field (tuple) – The field name tuple of the particle field the deposited field will be created from. This must be a field name tuple so yt can appropriately infer the correct particle type.
method (string) – This is the “method name” which will be looked up in the particle_deposit namespace as methodname_deposit. Current methods include simple_smooth, sum, std, cic, weighted_mean, nearest and count.
kernel_name (string, default 'cubic') – This is the name of the smoothing kernel to use. It is only used for the simple_smooth method and is otherwise ignored. Current supported kernel names include cubic, quartic, quintic, wendland2, wendland4, and wendland6.
weight_field ((field_type, field_name) or None) – Weighting field name for deposition method weighted_mean. If None, use the particle mass.
- Return type:
The field name tuple for the newly created field.
- add_field(name, function, sampling_type, *, force_override=False, **kwargs)¶
Dataset-specific call to add_field
Add a new field, along with supplemental metadata, to the list of available fields. This respects a number of arguments, all of which are passed on to the constructor for DerivedField.
- Parameters:
name (str) – is the name of the field.
function (callable) – A function handle that defines the field. Should accept arguments (field, data)
sampling_type (str) – “cell” or “particle” or “local”
force_override (bool) – If False (default), an error will be raised if a field of the same name already exists.
units (str) – A plain text string encoding the unit. Powers must be in python syntax (** instead of ^).
take_log (bool) – Describes whether the field should be logged
validators (list) – A list of FieldValidator objects
vector_field (bool) – Describes the dimensionality of the field. Currently unused.
display_name (str) – A name used in the plots
force_override – Whether to override an existing derived field. Does not work with on-disk fields.
- add_gradient_fields(fields=None)¶
Add gradient fields.
Creates four new grid-based fields that represent the components of the gradient of an existing field, plus an extra field for the magnitude of the gradient. The gradient is computed using second-order centered differences.
- Parameters:
fields (str or tuple(str, str), or a list of the previous) – Label(s) for at least one field. Can either represent a tuple (<field type>, <field fname>) or simply the field name. Warning: several field types may match the provided field name, in which case the first one discovered internally is used.
- Return type:
A list of field name tuples for the newly created fields.
- Raises:
YTFieldNotParsable – If fields are not parsable to yt field keys.
YTFieldNotFound : – If at least one field can not be identified.
Examples
>>> grad_fields = ds.add_gradient_fields(("gas", "density"))
>>> print(grad_fields)
[
    ("gas", "density_gradient_x"),
    ("gas", "density_gradient_y"),
    ("gas", "density_gradient_z"),
    ("gas", "density_gradient_magnitude"),
]
Note that the above example assumes ds.geometry == ‘cartesian’. In general, the function will create gradient components along the axes of the dataset coordinate system. For instance, with cylindrical data, one gets ‘density_gradient_<r,theta,z>’
- add_mesh_sampling_particle_field(sample_field, ptype='all')¶
Add a new mesh sampling particle field
Creates a new particle field which has the value of the deposit_field at the location of each particle of type ptype.
- Parameters:
sample_field (tuple) – The field name tuple of the mesh field to be deposited onto the particles. This must be a field name tuple so yt can appropriately infer the correct particle type.
ptype (string, default 'all') – The particle type onto which the deposition will occur.
- Return type:
The field name tuple for the newly created field.
Examples
>>> ds = yt.load("output_00080/info_00080.txt") ... ds.add_mesh_sampling_particle_field(("gas", "density"), ptype="all")
>>> print("The density at the location of the particle is:") ... print(ds.r["all", "cell_gas_density"]) The density at the location of the particle is: [9.33886124e-30 1.22174333e-28 1.20402333e-28 ... 2.77410331e-30 8.79467609e-31 3.50665136e-30] g/cm**3
>>> len(ds.r["all", "cell_gas_density"]) == len(ds.r["all", "particle_ones"]) True
- add_particle_filter(filter)¶
Add particle filter to the dataset.
Add filter to the dataset and set up the relevant derived fields. It will also add any filtered_type that the filter depends on.
- add_particle_union(union)¶
- all_data(find_max=False, **kwargs)¶
all_data is a wrapper to the Region object for creating a region which covers the entire simulation domain.
- property arr¶
Converts an array into a yt.units.yt_array.YTArray.
The returned YTArray will be dimensionless by default, but can be cast to arbitrary units using the units keyword argument.
- Parameters:
input_array (Iterable) – A tuple, list, or array to attach units to
units (String unit specification, unit symbol or astropy object) – The units of the array. Powers must be specified using python syntax (cm**3, not cm^3).
input_units (Deprecated in favor of 'units')
dtype (string or NumPy dtype object) – The dtype of the returned array data
Examples
>>> import yt
>>> import numpy as np
>>> ds = yt.load("IsolatedGalaxy/galaxy0030/galaxy0030")
>>> a = ds.arr([1, 2, 3], "cm")
>>> b = ds.arr([4, 5, 6], "m")
>>> a + b
YTArray([ 401., 502., 603.]) cm
>>> b + a
YTArray([ 4.01, 5.02, 6.03]) m
Arrays returned by this function know about the dataset’s unit system
>>> a = ds.arr(np.ones(5), "code_length")
>>> a.in_units("Mpccm/h")
YTArray([ 1.00010449, 1.00010449, 1.00010449, 1.00010449, 1.00010449]) Mpc
- property backup_filename¶
- property basename¶
- box(left_edge, right_edge, **kwargs)¶
box is a wrapper to the Region object for creating a region without having to specify a center value. It assumes the center is the midpoint between the left_edge and right_edge.
Keyword arguments are passed to the initializer of the YTRegion object (e.g. ds.region).
- property checksum¶
Computes md5 sum of a dataset.
Note: Currently this property is unable to determine a complete set of files that are a part of a given dataset. As a first approximation, the checksum of parameter_file is calculated. In case parameter_file is a directory, the checksum of all files inside the directory is calculated.
- close()¶
- coordinates = None¶
- create_field_info()¶
- default_field = ('gas', 'density')¶
- default_fluid_type = 'gas'¶
- default_units = {'length_unit': 'cm', 'magnetic_unit': 'gauss', 'mass_unit': 'g', 'temperature_unit': 'K', 'time_unit': 's', 'velocity_unit': 'cm/s'}¶
- define_unit(symbol, value, tex_repr=None, offset=None, prefixable=False)¶
Define a new unit and add it to the dataset’s unit registry.
- Parameters:
symbol (string) – The symbol for the new unit.
value (tuple or YTQuantity) – The definition of the new unit in terms of some other units. For example, one would define a new “mph” unit with (1.0, “mile/hr”)
tex_repr (string, optional) – The LaTeX representation of the new unit. If one is not supplied, it will be generated automatically based on the symbol string.
offset (float, optional) – The default offset for the unit. If not set, an offset of 0 is assumed.
prefixable (bool, optional) – Whether or not the new unit can use SI prefixes. Default: False
Examples
>>> ds.define_unit("mph", (1.0, "mile/hr"))
>>> two_weeks = YTQuantity(14.0, "days")
>>> ds.define_unit("fortnight", two_weeks)
- property derived_field_list¶
- property directory¶
- domain_offset = array([0, 0, 0])¶
- property field_list¶
- property fields¶
- fields_detected = False¶
- find_field_values_at_point(fields, coords)¶
Returns the values [field1, field2,…] of the fields at the given coordinates. Returns a list of field values in the same order as the input fields.
- find_field_values_at_points(fields, coords)¶
Returns the values [field1, field2,…] of the fields at the given [(x1, y1, z1), (x2, y2, z2),…] points. Returns a list of field values in the same order as the input fields.
- find_max(field, source=None, to_array=True)¶
Returns (value, location) of the maximum of a given field.
This is a wrapper around _find_extremum
- find_min(field, source=None, to_array=True)¶
Returns (value, location) for the minimum of a given field.
This is a wrapper around _find_extremum
- force_periodicity(val=True)¶
Override box periodicity to (True, True, True). Use ds.force_periodicity(False) to use the actual box periodicity.
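A brief sketch, assuming ds has been loaded:
>>> ds.force_periodicity()       # treat the domain as periodic along every axis
>>> print(ds.periodicity)
(True, True, True)
>>> ds.force_periodicity(False)  # restore the dataset's actual periodicity flags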
- property fullpath¶
- get_smallest_appropriate_unit(v, quantity='distance', return_quantity=False)¶
Returns, as a string, the largest whole unit smaller than the YTQuantity passed to it.
The quantity keyword can be equal to distance or time. In the case of distance, the units are: ‘Mpc’, ‘kpc’, ‘pc’, ‘au’, ‘rsun’, ‘km’, etc. For time, the units are: ‘Myr’, ‘kyr’, ‘yr’, ‘day’, ‘hr’, ‘s’, ‘ms’, etc.
If return_quantity is set to True, it finds the largest YTQuantity object with a whole unit and a power of ten as the coefficient, and it returns this YTQuantity.
- get_unit_from_registry(unit_str)¶
Creates a unit object matching the string expression, using this dataset’s unit registry.
- Parameters:
unit_str (str) – string that we can parse for a sympy Expr.
- has_key(key)¶
Checks units, parameters, and conversion factors. Returns a boolean.
- property index¶
- property ires_factor¶
- known_filters: dict[ParticleType, ParticleFilter] | None = None¶
- property max_level¶
- property min_level¶
- property parameter_filename¶
- property particle_fields_by_type¶
- property particle_type_counts¶
- particle_unions: dict[ParticleType, ParticleUnion] | None = None¶
- property particles_exist¶
- property periodicity¶
- print_key_parameters()¶
- print_stats()¶
- property quan¶
Converts a scalar into a yt.units.yt_array.YTQuantity.
The returned YTQuantity will be dimensionless by default, but can be cast to arbitrary units using the units keyword argument.
- Parameters:
input_scalar (an integer or floating point scalar) – The scalar to attach units to
units (String unit specification, unit symbol or astropy object) – The units of the quantity. Powers must be specified using python syntax (cm**3, not cm^3).
input_units (Deprecated in favor of 'units')
dtype (string or NumPy dtype object) – The dtype of the array data.
Examples
>>> import yt
>>> ds = yt.load("IsolatedGalaxy/galaxy0030/galaxy0030")
>>> a = ds.quan(1, "cm")
>>> b = ds.quan(2, "m")
>>> a + b
201.0 cm
>>> b + a
2.01 m
Quantities created this way automatically know about the unit system of the dataset.
>>> a = ds.quan(5, "code_length")
>>> a.in_cgs()
1.543e+25 cm
- relative_refinement(l0, l1)¶
- set_code_units()¶
- set_field_label_format(format_property, value)¶
Set format properties for how fields will be written out. Accepts:
format_property : string indicating what property to set
value : the value to set for that format_property
- set_units()¶
Creates the unit registry for this dataset.
- setup_cosmology()¶
If this dataset is cosmological, add a cosmology object.
- setup_deprecated_fields()¶
- storage_filename = None¶
- property units¶
- class yt.frontends.fits.data_structures.SpectralCubeFITSDataset(filename=None, *args, **kwargs)[source]¶
Bases:
SkyDataFITSDataset
- add_deposited_particle_field(deposit_field, method, kernel_name='cubic', weight_field=None)¶
Add a new deposited particle field
Creates a new deposited field based on the particle deposit_field.
- Parameters:
deposit_field (tuple) – The field name tuple of the particle field the deposited field will be created from. This must be a field name tuple so yt can appropriately infer the correct particle type.
method (string) – This is the “method name” which will be looked up in the particle_deposit namespace as methodname_deposit. Current methods include simple_smooth, sum, std, cic, weighted_mean, nearest and count.
kernel_name (string, default 'cubic') – This is the name of the smoothing kernel to use. It is only used for the simple_smooth method and is otherwise ignored. Current supported kernel names include cubic, quartic, quintic, wendland2, wendland4, and wendland6.
weight_field ((field_type, field_name) or None) – Weighting field name for deposition method weighted_mean. If None, use the particle mass.
- Return type:
The field name tuple for the newly created field.
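A hedged sketch, assuming the dataset exposes an "all" particle type with a particle_mass field (field names are illustrative):
>>> fname = ds.add_deposited_particle_field(("all", "particle_mass"), "sum")
>>> ad = ds.all_data()
>>> deposited_mass = ad[fname]  # particle mass summed onto the mesh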
- add_field(name, function, sampling_type, *, force_override=False, **kwargs)¶
Dataset-specific call to add_field
Add a new field, along with supplemental metadata, to the list of available fields. This respects a number of arguments, all of which are passed on to the constructor for DerivedField.
- Parameters:
name (str) – is the name of the field.
function (callable) – A function handle that defines the field. Should accept arguments (field, data)
sampling_type (str) – “cell” or “particle” or “local”
force_override (bool) – If False (default), an error will be raised if a field of the same name already exists.
units (str) – A plain text string encoding the unit. Powers must be in python syntax (** instead of ^).
take_log (bool) – Describes whether the field should be logged
validators (list) – A list of FieldValidator objects.
vector_field (bool) – Describes the dimensionality of the field. Currently unused.
display_name (str) – A name used in the plots
force_override – Whether to override an existing derived field. Does not work with on-disk fields.
- add_gradient_fields(fields=None)¶
Add gradient fields.
Creates four new grid-based fields that represent the components of the gradient of an existing field, plus an extra field for the magnitude of the gradient. The gradient is computed using second-order centered differences.
- Parameters:
fields (str or tuple(str, str), or a list of the previous) – Label(s) for at least one field. Can either represent a tuple (<field type>, <field fname>) or simply the field name. Warning: several field types may match the provided field name, in which case the first one discovered internally is used.
- Return type:
A list of field name tuples for the newly created fields.
- Raises:
YTFieldNotParsable – If fields are not parsable to yt field keys.
YTFieldNotFound – If at least one field cannot be identified.
Examples
>>> grad_fields = ds.add_gradient_fields(("gas", "density"))
>>> print(grad_fields)
[
    ("gas", "density_gradient_x"),
    ("gas", "density_gradient_y"),
    ("gas", "density_gradient_z"),
    ("gas", "density_gradient_magnitude"),
]
Note that the above example assumes ds.geometry == ‘cartesian’. In general, the function will create gradient components along the axes of the dataset coordinate system. For instance, with cylindrical data, one gets ‘density_gradient_<r,theta,z>’
- add_mesh_sampling_particle_field(sample_field, ptype='all')¶
Add a new mesh sampling particle field
Creates a new particle field which has the value of the deposit_field at the location of each particle of type ptype.
- Parameters:
sample_field (tuple) – The field name tuple of the mesh field to be deposited onto the particles. This must be a field name tuple so yt can appropriately infer the correct particle type.
ptype (string, default 'all') – The particle type onto which the deposition will occur.
- Return type:
The field name tuple for the newly created field.
Examples
>>> ds = yt.load("output_00080/info_00080.txt")
... ds.add_mesh_sampling_particle_field(("gas", "density"), ptype="all")
>>> print("The density at the location of the particle is:")
... print(ds.r["all", "cell_gas_density"])
The density at the location of the particle is:
[9.33886124e-30 1.22174333e-28 1.20402333e-28 ... 2.77410331e-30
 8.79467609e-31 3.50665136e-30] g/cm**3
>>> len(ds.r["all", "cell_gas_density"]) == len(ds.r["all", "particle_ones"])
True
- add_particle_filter(filter)¶
Add particle filter to the dataset.
Add filter to the dataset and set up relevant derived_field. It will also add any filtered_type that the filter depends on.
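A sketch of the usual filter workflow; the field name and threshold are illustrative and assume the dataset exposes particles with a particle_mass field:
>>> import yt
>>> @yt.particle_filter(requires=["particle_mass"], filtered_type="all")
... def heavy(pfilter, data):
...     # keep only particles above an (illustrative) mass threshold
...     return data[pfilter.filtered_type, "particle_mass"] > data.ds.quan(1e-3, "g")
>>> ds.add_particle_filter("heavy")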
- add_particle_union(union)¶
- all_data(find_max=False, **kwargs)¶
all_data is a wrapper to the Region object for creating a region which covers the entire simulation domain.
- property arr¶
Converts an array into a yt.units.yt_array.YTArray. The returned YTArray will be dimensionless by default, but can be cast to arbitrary units using the units keyword argument.
- Parameters:
input_array (Iterable) – A tuple, list, or array to attach units to
units (String unit specification, unit symbol or astropy object) – The units of the array. Powers must be specified using python syntax (cm**3, not cm^3).
input_units (Deprecated in favor of 'units')
dtype (string or NumPy dtype object) – The dtype of the returned array data
Examples
>>> import yt
>>> import numpy as np
>>> ds = yt.load("IsolatedGalaxy/galaxy0030/galaxy0030")
>>> a = ds.arr([1, 2, 3], "cm")
>>> b = ds.arr([4, 5, 6], "m")
>>> a + b
YTArray([ 401., 502., 603.]) cm
>>> b + a
YTArray([ 4.01, 5.02, 6.03]) m
Arrays returned by this function know about the dataset’s unit system
>>> a = ds.arr(np.ones(5), "code_length")
>>> a.in_units("Mpccm/h")
YTArray([ 1.00010449, 1.00010449, 1.00010449, 1.00010449, 1.00010449]) Mpc
- property backup_filename¶
- property basename¶
- box(left_edge, right_edge, **kwargs)¶
box is a wrapper to the Region object for creating a region without having to specify a center value. It assumes the center is the midpoint between the left_edge and right_edge.
Keyword arguments are passed to the initializer of the YTRegion object (e.g. ds.region).
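For example, a region spanning the lower-left octant of the domain (edges in code units):
>>> octant = ds.box(ds.domain_left_edge, ds.domain_center)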
- property checksum¶
Computes md5 sum of a dataset.
Note: Currently this property is unable to determine a complete set of files that are a part of a given dataset. As a first approximation, the checksum of parameter_file is calculated. In case parameter_file is a directory, the checksum of all files inside the directory is calculated.
- close()¶
- coordinates = None¶
- create_field_info()¶
- default_field = ('gas', 'density')¶
- default_fluid_type = 'gas'¶
- default_units = {'length_unit': 'cm', 'magnetic_unit': 'gauss', 'mass_unit': 'g', 'temperature_unit': 'K', 'time_unit': 's', 'velocity_unit': 'cm/s'}¶
- define_unit(symbol, value, tex_repr=None, offset=None, prefixable=False)¶
Define a new unit and add it to the dataset’s unit registry.
- Parameters:
symbol (string) – The symbol for the new unit.
value (tuple or YTQuantity) – The definition of the new unit in terms of some other units. For example, one would define a new “mph” unit with (1.0, “mile/hr”)
tex_repr (string, optional) – The LaTeX representation of the new unit. If one is not supplied, it will be generated automatically based on the symbol string.
offset (float, optional) – The default offset for the unit. If not set, an offset of 0 is assumed.
prefixable (bool, optional) – Whether or not the new unit can use SI prefixes. Default: False
Examples
>>> ds.define_unit("mph", (1.0, "mile/hr"))
>>> two_weeks = YTQuantity(14.0, "days")
>>> ds.define_unit("fortnight", two_weeks)
- property derived_field_list¶
- property directory¶
- domain_offset = array([0, 0, 0])¶
- property field_list¶
- property fields¶
- fields_detected = False¶
- find_field_values_at_point(fields, coords)¶
Returns the values [field1, field2,…] of the fields at the given coordinates. Returns a list of field values in the same order as the input fields.
- find_field_values_at_points(fields, coords)¶
Returns the values [field1, field2,…] of the fields at the given [(x1, y1, z1), (x2, y2, z2),…] points. Returns a list of field values in the same order as the input fields.
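A minimal sketch; coordinates are assumed to be in code units and the field name is illustrative:
>>> points = [[0.3, 0.4, 0.5], [0.6, 0.6, 0.6]]
>>> vals = ds.find_field_values_at_points([("gas", "density")], points)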
- find_max(field, source=None, to_array=True)¶
Returns (value, location) of the maximum of a given field.
This is a wrapper around _find_extremum
- find_min(field, source=None, to_array=True)¶
Returns (value, location) for the minimum of a given field.
This is a wrapper around _find_extremum
- force_periodicity(val=True)¶
Override box periodicity to (True, True, True). Use ds.force_periodicity(False) to restore the actual box periodicity.
- property fullpath¶
- get_smallest_appropriate_unit(v, quantity='distance', return_quantity=False)¶
Returns, as a string, the largest whole unit smaller than the YTQuantity passed to it.
The quantity keyword can be equal to distance or time. In the case of distance, the units are: ‘Mpc’, ‘kpc’, ‘pc’, ‘au’, ‘rsun’, ‘km’, etc. For time, the units are: ‘Myr’, ‘kyr’, ‘yr’, ‘day’, ‘hr’, ‘s’, ‘ms’, etc.
If return_quantity is set to True, it finds the largest YTQuantity object with a whole unit and a power of ten as the coefficient, and it returns this YTQuantity.
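A sketch of the return_quantity branch (the exact quantity returned depends on the input value):
>>> ds.get_smallest_appropriate_unit(ds.quan(0.5, "Mpc"), return_quantity=True)  # e.g. a YTQuantity with a power-of-ten coefficient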
- get_unit_from_registry(unit_str)¶
Creates a unit object matching the string expression, using this dataset’s unit registry.
- Parameters:
unit_str (str) – string that we can parse for a sympy Expr.
- has_key(key)¶
Checks units, parameters, and conversion factors. Returns a boolean.
- property index¶
- property ires_factor¶
- known_filters: dict[ParticleType, ParticleFilter] | None = None¶
- property max_level¶
- property min_level¶
- property parameter_filename¶
- property particle_fields_by_type¶
- property particle_type_counts¶
- particle_unions: dict[ParticleType, ParticleUnion] | None = None¶
- property particles_exist¶
- property periodicity¶
- print_key_parameters()¶
- print_stats()¶
- property quan¶
Converts a scalar into a yt.units.yt_array.YTQuantity. The returned YTQuantity will be dimensionless by default, but can be cast to arbitrary units using the units keyword argument.
- Parameters:
input_scalar (an integer or floating point scalar) – The scalar to attach units to
units (String unit specification, unit symbol or astropy object) – The units of the quantity. Powers must be specified using python syntax (cm**3, not cm^3).
input_units (Deprecated in favor of 'units')
dtype (string or NumPy dtype object) – The dtype of the array data.
Examples
>>> import yt
>>> ds = yt.load("IsolatedGalaxy/galaxy0030/galaxy0030")
>>> a = ds.quan(1, "cm")
>>> b = ds.quan(2, "m")
>>> a + b
201.0 cm
>>> b + a
2.01 m
Quantities created this way automatically know about the unit system of the dataset.
>>> a = ds.quan(5, "code_length")
>>> a.in_cgs()
1.543e+25 cm
- relative_refinement(l0, l1)¶
- set_code_units()¶
- set_field_label_format(format_property, value)¶
Set format properties for how fields will be written out. Accepts:
format_property : string indicating what property to set
value : the value to set for that format_property
- set_units()¶
Creates the unit registry for this dataset.
- setup_cosmology()¶
If this dataset is cosmological, add a cosmology object.
- setup_deprecated_fields()¶
- storage_filename = None¶
- property units¶
- class yt.frontends.fits.data_structures.SpectralCubeFITSHierarchy(ds, dataset_type='fits')[source]¶
Bases:
FITSHierarchy
- clear_all_data()¶
This routine clears all the data currently being held onto by the grids and the data io handler.
- comm = None¶
- convert(unit)¶
- float_type = 'float64'¶
- get_data(node, name)¶
Return the dataset with a given name located at node in the datafile.
- get_dependencies(fields)¶
- get_levels()¶
- get_smallest_dx()¶
Returns (in code units) the smallest cell size in the simulation.
- property grid_corners¶
- lock_grids_to_parents()¶
This function locks grid edges to their parents.
This is useful in cases where the grid structure may be somewhat irregular, or where setting the left and right edges is a lossy process. It is designed to correct situations where left/right edges may be set slightly incorrectly, resulting in discontinuities in images and the like.
- property parameters¶
- partition_index_2d(axis)¶
- partition_index_3d(ds, padding=0.0, rank_ratio=1)¶
- partition_index_3d_bisection_list()¶
Returns an array that is used to drive _partition_index_3d_bisection, below.
- partition_region_3d(left_edge, right_edge, padding=0.0, rank_ratio=1)¶
Given a region, it subdivides it into smaller regions for parallel analysis.
- print_stats()¶
Prints out (stdout) relevant information about the simulation
- save_data(array, node, name, set_attr=None, force=False, passthrough=False)¶
Arbitrary numpy data will be saved to the region in the datafile described by node and name. If the data file does not exist, no error is raised and nothing is saved.
- select_grids(level)¶
Returns an array of grids at level.
- class yt.frontends.fits.data_structures.YTFITSDataset(filename=None, *args, **kwargs)[source]¶
Bases:
FITSDataset
- add_deposited_particle_field(deposit_field, method, kernel_name='cubic', weight_field=None)¶
Add a new deposited particle field
Creates a new deposited field based on the particle deposit_field.
- Parameters:
deposit_field (tuple) – The field name tuple of the particle field the deposited field will be created from. This must be a field name tuple so yt can appropriately infer the correct particle type.
method (string) – This is the “method name” which will be looked up in the particle_deposit namespace as methodname_deposit. Current methods include simple_smooth, sum, std, cic, weighted_mean, nearest and count.
kernel_name (string, default 'cubic') – This is the name of the smoothing kernel to use. It is only used for the simple_smooth method and is otherwise ignored. Current supported kernel names include cubic, quartic, quintic, wendland2, wendland4, and wendland6.
weight_field ((field_type, field_name) or None) – Weighting field name for deposition method weighted_mean. If None, use the particle mass.
- Return type:
The field name tuple for the newly created field.
- add_field(name, function, sampling_type, *, force_override=False, **kwargs)¶
Dataset-specific call to add_field
Add a new field, along with supplemental metadata, to the list of available fields. This respects a number of arguments, all of which are passed on to the constructor for DerivedField.
- Parameters:
name (str) – is the name of the field.
function (callable) – A function handle that defines the field. Should accept arguments (field, data)
sampling_type (str) – “cell” or “particle” or “local”
force_override (bool) – If False (default), an error will be raised if a field of the same name already exists.
units (str) – A plain text string encoding the unit. Powers must be in python syntax (** instead of ^).
take_log (bool) – Describes whether the field should be logged
validators (list) – A list of FieldValidator objects.
vector_field (bool) – Describes the dimensionality of the field. Currently unused.
display_name (str) – A name used in the plots
force_override – Whether to override an existing derived field. Does not work with on-disk fields.
- add_gradient_fields(fields=None)¶
Add gradient fields.
Creates four new grid-based fields that represent the components of the gradient of an existing field, plus an extra field for the magnitude of the gradient. The gradient is computed using second-order centered differences.
- Parameters:
fields (str or tuple(str, str), or a list of the previous) – Label(s) for at least one field. Can either represent a tuple (<field type>, <field fname>) or simply the field name. Warning: several field types may match the provided field name, in which case the first one discovered internally is used.
- Return type:
A list of field name tuples for the newly created fields.
- Raises:
YTFieldNotParsable – If fields are not parsable to yt field keys.
YTFieldNotFound – If at least one field cannot be identified.
Examples
>>> grad_fields = ds.add_gradient_fields(("gas", "density"))
>>> print(grad_fields)
[
    ("gas", "density_gradient_x"),
    ("gas", "density_gradient_y"),
    ("gas", "density_gradient_z"),
    ("gas", "density_gradient_magnitude"),
]
Note that the above example assumes ds.geometry == ‘cartesian’. In general, the function will create gradient components along the axes of the dataset coordinate system. For instance, with cylindrical data, one gets ‘density_gradient_<r,theta,z>’
- add_mesh_sampling_particle_field(sample_field, ptype='all')¶
Add a new mesh sampling particle field
Creates a new particle field which has the value of the deposit_field at the location of each particle of type ptype.
- Parameters:
sample_field (tuple) – The field name tuple of the mesh field to be deposited onto the particles. This must be a field name tuple so yt can appropriately infer the correct particle type.
ptype (string, default 'all') – The particle type onto which the deposition will occur.
- Return type:
The field name tuple for the newly created field.
Examples
>>> ds = yt.load("output_00080/info_00080.txt")
... ds.add_mesh_sampling_particle_field(("gas", "density"), ptype="all")
>>> print("The density at the location of the particle is:")
... print(ds.r["all", "cell_gas_density"])
The density at the location of the particle is:
[9.33886124e-30 1.22174333e-28 1.20402333e-28 ... 2.77410331e-30
 8.79467609e-31 3.50665136e-30] g/cm**3
>>> len(ds.r["all", "cell_gas_density"]) == len(ds.r["all", "particle_ones"])
True
- add_particle_filter(filter)¶
Add particle filter to the dataset.
Add filter to the dataset and set up relevant derived_field. It will also add any filtered_type that the filter depends on.
- add_particle_union(union)¶
- all_data(find_max=False, **kwargs)¶
all_data is a wrapper to the Region object for creating a region which covers the entire simulation domain.
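For example (the field name is illustrative):
>>> ad = ds.all_data()
>>> ad["gas", "density"].max()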
- property arr¶
Converts an array into a yt.units.yt_array.YTArray. The returned YTArray will be dimensionless by default, but can be cast to arbitrary units using the units keyword argument.
- Parameters:
input_array (Iterable) – A tuple, list, or array to attach units to
units (String unit specification, unit symbol or astropy object) – The units of the array. Powers must be specified using python syntax (cm**3, not cm^3).
input_units (Deprecated in favor of 'units')
dtype (string or NumPy dtype object) – The dtype of the returned array data
Examples
>>> import yt
>>> import numpy as np
>>> ds = yt.load("IsolatedGalaxy/galaxy0030/galaxy0030")
>>> a = ds.arr([1, 2, 3], "cm")
>>> b = ds.arr([4, 5, 6], "m")
>>> a + b
YTArray([ 401., 502., 603.]) cm
>>> b + a
YTArray([ 4.01, 5.02, 6.03]) m
Arrays returned by this function know about the dataset’s unit system
>>> a = ds.arr(np.ones(5), "code_length")
>>> a.in_units("Mpccm/h")
YTArray([ 1.00010449, 1.00010449, 1.00010449, 1.00010449, 1.00010449]) Mpc
- property backup_filename¶
- property basename¶
- box(left_edge, right_edge, **kwargs)¶
box is a wrapper to the Region object for creating a region without having to specify a center value. It assumes the center is the midpoint between the left_edge and right_edge.
Keyword arguments are passed to the initializer of the YTRegion object (e.g. ds.region).
- property checksum¶
Computes md5 sum of a dataset.
Note: Currently this property is unable to determine a complete set of files that are a part of a given dataset. As a first approximation, the checksum of parameter_file is calculated. In case parameter_file is a directory, the checksum of all files inside the directory is calculated.
- close()¶
- coordinates = None¶
- create_field_info()¶
- default_field = ('gas', 'density')¶
- default_fluid_type = 'gas'¶
- default_units = {'length_unit': 'cm', 'magnetic_unit': 'gauss', 'mass_unit': 'g', 'temperature_unit': 'K', 'time_unit': 's', 'velocity_unit': 'cm/s'}¶
- define_unit(symbol, value, tex_repr=None, offset=None, prefixable=False)¶
Define a new unit and add it to the dataset’s unit registry.
- Parameters:
symbol (string) – The symbol for the new unit.
value (tuple or YTQuantity) – The definition of the new unit in terms of some other units. For example, one would define a new “mph” unit with (1.0, “mile/hr”)
tex_repr (string, optional) – The LaTeX representation of the new unit. If one is not supplied, it will be generated automatically based on the symbol string.
offset (float, optional) – The default offset for the unit. If not set, an offset of 0 is assumed.
prefixable (bool, optional) – Whether or not the new unit can use SI prefixes. Default: False
Examples
>>> ds.define_unit("mph", (1.0, "mile/hr"))
>>> two_weeks = YTQuantity(14.0, "days")
>>> ds.define_unit("fortnight", two_weeks)
- property derived_field_list¶
- property directory¶
- domain_offset = array([0, 0, 0])¶
- property field_list¶
- property fields¶
- fields_detected = False¶
- find_field_values_at_point(fields, coords)¶
Returns the values [field1, field2,…] of the fields at the given coordinates. Returns a list of field values in the same order as the input fields.
- find_field_values_at_points(fields, coords)¶
Returns the values [field1, field2,…] of the fields at the given [(x1, y1, z1), (x2, y2, z2),…] points. Returns a list of field values in the same order as the input fields.
- find_max(field, source=None, to_array=True)¶
Returns (value, location) of the maximum of a given field.
This is a wrapper around _find_extremum
- find_min(field, source=None, to_array=True)¶
Returns (value, location) for the minimum of a given field.
This is a wrapper around _find_extremum
- force_periodicity(val=True)¶
Override box periodicity to (True, True, True). Use ds.force_periodicity(False) to restore the actual box periodicity.
- property fullpath¶
- get_smallest_appropriate_unit(v, quantity='distance', return_quantity=False)¶
Returns, as a string, the largest whole unit smaller than the YTQuantity passed to it.
The quantity keyword can be equal to distance or time. In the case of distance, the units are: ‘Mpc’, ‘kpc’, ‘pc’, ‘au’, ‘rsun’, ‘km’, etc. For time, the units are: ‘Myr’, ‘kyr’, ‘yr’, ‘day’, ‘hr’, ‘s’, ‘ms’, etc.
If return_quantity is set to True, it finds the largest YTQuantity object with a whole unit and a power of ten as the coefficient, and it returns this YTQuantity.
- get_unit_from_registry(unit_str)¶
Creates a unit object matching the string expression, using this dataset’s unit registry.
- Parameters:
unit_str (str) – string that we can parse for a sympy Expr.
- has_key(key)¶
Checks units, parameters, and conversion factors. Returns a boolean.
- property index¶
- property ires_factor¶
- known_filters: dict[ParticleType, ParticleFilter] | None = None¶
- property max_level¶
- property min_level¶
- property parameter_filename¶
- property particle_fields_by_type¶
- property particle_type_counts¶
- particle_unions: dict[ParticleType, ParticleUnion] | None = None¶
- property particles_exist¶
- property periodicity¶
- print_key_parameters()¶
- print_stats()¶
- property quan¶
Converts a scalar into a yt.units.yt_array.YTQuantity. The returned YTQuantity will be dimensionless by default, but can be cast to arbitrary units using the units keyword argument.
- Parameters:
input_scalar (an integer or floating point scalar) – The scalar to attach units to
units (String unit specification, unit symbol or astropy object) – The units of the quantity. Powers must be specified using python syntax (cm**3, not cm^3).
input_units (Deprecated in favor of 'units')
dtype (string or NumPy dtype object) – The dtype of the array data.
Examples
>>> import yt
>>> ds = yt.load("IsolatedGalaxy/galaxy0030/galaxy0030")
>>> a = ds.quan(1, "cm")
>>> b = ds.quan(2, "m")
>>> a + b
201.0 cm
>>> b + a
2.01 m
Quantities created this way automatically know about the unit system of the dataset.
>>> a = ds.quan(5, "code_length")
>>> a.in_cgs()
1.543e+25 cm
- relative_refinement(l0, l1)¶
- set_code_units()¶
- set_field_label_format(format_property, value)¶
Set format properties for how fields will be written out. Accepts:
format_property : string indicating what property to set
value : the value to set for that format_property
- set_units()¶
Creates the unit registry for this dataset.
- setup_cosmology()¶
If this dataset is cosmological, add a cosmology object.
- setup_deprecated_fields()¶
- storage_filename = None¶
- property units¶