data
Load DICOM datasets as numpy arrays with voxel dimensions
CT scans in DiffDRR are stored using the `torchio.Subject` dataclass. `torchio` provides a convenient and consistent mechanism for reading volumes from a variety of formats and orientations. We canonicalize all volumes to the RAS+ coordinate space. In addition to reading an input volume, you can also pass the following to `diffdrr.data.read` when loading a subject:
- `labelmap`: a 3D segmentation of the input volume
- `labels`: a subset of structures from the labelmap that you want to render
- `orientation`: a frame-of-reference change for the C-arm (currently, "AP" and "PA" are supported)
- `bone_attenuation_multiplier`: a constant multiplier on the estimated density of bone voxels
- `fiducials`: a tensor of 3D fiducial marks in world coordinates
- `**kwargs`: any additional kwargs are passed to the `torchio.Subject` and can be accessed as a dictionary
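To illustrate what `bone_attenuation_multiplier` does conceptually, here is a minimal numpy sketch that scales the estimated density of high-attenuation voxels. The HU threshold (`bone_threshold_hu`) and the function name are illustrative assumptions, not DiffDRR's actual implementation:

```python
import numpy as np

def apply_bone_multiplier(volume_hu, bone_attenuation_multiplier=1.0, bone_threshold_hu=350.0):
    """Scale the estimated density of high-attenuation (bone) voxels.

    `bone_threshold_hu` is an illustrative cutoff; DiffDRR's internal
    density estimation may use different thresholds and scaling.
    """
    density = volume_hu.astype(np.float64).copy()
    bone_mask = density >= bone_threshold_hu  # voxels treated as bone
    density[bone_mask] *= bone_attenuation_multiplier
    return density

# A toy "volume" in Hounsfield units: air, soft tissue, and bone
hu = np.array([-1000.0, 40.0, 1000.0])
scaled = apply_bone_multiplier(hu, bone_attenuation_multiplier=2.0)
# only the bone voxel (1000 HU) is scaled; air and soft tissue are untouched
```

Increasing the multiplier makes bony structures appear more prominent in the rendered DRR relative to soft tissue.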
load_example_ct
```python
def load_example_ct(
    labels: int | list = None,                 # Labels from the mask of structures to render
    orientation: str = "AP",                   # Frame-of-reference change
    bone_attenuation_multiplier: float = 1.0,  # Scalar multiplier on density of bone voxels
    **kwargs,
) -> Subject:
```
Load an example chest CT for demonstration purposes.
read
```python
def read(
    volume: str | Path | ScalarImage,          # CT volume
    labelmap: str | Path | LabelMap = None,    # Labelmap for the CT volume
    labels: int | list = None,                 # Labels from the mask of structures to render
    orientation: str | None = "AP",            # Frame-of-reference change
    bone_attenuation_multiplier: float = 1.0,  # Scalar multiplier on density of high-attenuation voxels
    fiducials: torch.Tensor = None,            # 3D fiducials in world coordinates
    transform: RigidTransform = None,          # RigidTransform to apply to the volume's affine
    center_volume: bool = True,                # Move the volume's isocenter to the world origin
    resample_target=None,                      # Resampling resolution passed to torchio.transforms.Resample
    **kwargs,                                  # Any additional information to be stored in the torchio.Subject
) -> Subject:
```
Read an image volume from a variety of formats and, optionally, a labelmap for the volume. The volume is converted to the RAS+ coordinate system and, if `center_volume=True`, its isocenter is moved to the world origin.
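The isocenter-centering step can be sketched in plain numpy: translate the volume's affine so that the center voxel maps to the world origin. This is a conceptual sketch of what `center_volume=True` accomplishes, not DiffDRR's actual code; the function name `center_affine` is an assumption for illustration:

```python
import numpy as np

def center_affine(affine, shape):
    """Return a new 4x4 affine whose volume isocenter sits at the world origin.

    A conceptual sketch of `center_volume=True`; DiffDRR's actual
    implementation may differ in detail.
    """
    affine = np.asarray(affine, dtype=np.float64)
    # voxel-space center of the volume (i, j, k)
    center_ijk = (np.asarray(shape, dtype=np.float64) - 1) / 2
    # map the center voxel into world (RAS+) coordinates
    center_world = affine @ np.append(center_ijk, 1.0)
    centered = affine.copy()
    centered[:3, 3] -= center_world[:3]  # shift so the isocenter lands at (0, 0, 0)
    return centered

# 1 mm isotropic volume of shape (100, 100, 100) with an identity affine
A = np.eye(4)
A_centered = center_affine(A, (100, 100, 100))
# the center voxel (49.5, 49.5, 49.5) now maps to the world origin
```

Centering the volume this way means camera poses for the C-arm can be specified relative to the anatomy's isocenter rather than an arbitrary scanner origin.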