workflows

Module: workflows.align

ResliceFlow([output_strategy, mix_names, ...])

Methods

Workflow([output_strategy, mix_names, ...])

Methods

load_nifti(fname[, return_img, ...])
reslice(data, affine, zooms, new_zooms[, ...]) Reslice data with new voxel resolution defined by new_zooms
save_nifti(fname, data, affine[, hdr])

Module: workflows.base

IntrospectiveArgumentParser([prog, usage, ...])

Methods

NumpyDocString(docstring[, config])
get_args_default(func)

Module: workflows.combined_workflow

CombinedWorkflow([output_strategy, ...])

Methods

Workflow([output_strategy, mix_names, ...])

Methods

iteritems(d, **kw) Return an iterator over the (key, value) pairs of a dictionary.

Module: workflows.denoise

NLMeansFlow([output_strategy, mix_names, ...])

Methods

Workflow([output_strategy, mix_names, ...])

Methods

estimate_sigma(arr[, ...]) Standard deviation estimation from local patches
nlmeans(arr, sigma[, mask, patch_radius, ...]) Non-local means for denoising 3D and 4D images

Module: workflows.docstring_parser

This was taken directly from the file docscrape.py of the numpydoc package.

Copyright (C) 2008 Stefan van der Walt <stefan@mentat.za.net>, Pauli Virtanen <pav@iki.fi>

Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met:

  1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer.
  2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.

THIS SOFTWARE IS PROVIDED BY THE AUTHOR ``AS IS’’ AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

NumpyDocString(docstring[, config])
Reader(data) A line-based string reader.
dedent_lines(lines) Deindent a list of lines maximally
warn Issue a warning, or maybe ignore it or raise an exception.

Module: workflows.flow_runner

IntrospectiveArgumentParser([prog, usage, ...])

Methods

get_level(lvl) Transforms the logging level passed on the command line into a proper logging level name.
iteritems(d, **kw) Return an iterator over the (key, value) pairs of a dictionary.
run_flow(flow) Wraps the process of building an argparser that reflects the workflow we want to run, along with some generic parameters like logging, force and output strategies.

Module: workflows.io

IoInfoFlow([output_strategy, mix_names, ...])

Methods

Workflow([output_strategy, mix_names, ...])

Methods

load_nifti(fname[, return_img, ...])

Module: workflows.mask

MaskFlow([output_strategy, mix_names, ...])

Methods

Workflow([output_strategy, mix_names, ...])

Methods

load_nifti(fname[, return_img, ...])
save_nifti(fname, data, affine[, hdr])

Module: workflows.multi_io

IOIterator([output_strategy, mix_names]) Create output filenames that work nicely with multiple input files from
basename_without_extension(fname)
common_start(sa, sb) Returns the longest common substring from the beginning of sa and sb
concatenate_inputs(multi_inputs) Concatenate list of inputs
connect_output_paths(inputs, out_dir, out_files) Generates a list of output files paths based on input files and output strategies.
get_args_default(func)
glob(pathname, *[, recursive]) Return a list of paths matching a pathname pattern.
io_iterator(inputs, out_dir, fnames[, ...]) Creates an IOIterator from the parameters.
io_iterator_(frame, fnc[, output_strategy, ...]) Creates an IOIterator using introspection.
slash_to_under(dir_str)

Module: workflows.reconst

ConstrainedSphericalDeconvModel(gtab, response)

Methods

CsaOdfModel(gtab, sh_order[, smooth, ...]) Implementation of Constant Solid Angle reconstruction method.
DiffusionKurtosisModel(gtab[, fit_method]) Class for the Diffusion Kurtosis Model
ReconstCSAFlow([output_strategy, mix_names, ...])

Methods

ReconstCSDFlow([output_strategy, mix_names, ...])

Methods

ReconstDkiFlow([output_strategy, mix_names, ...])

Methods

ReconstDtiFlow([output_strategy, mix_names, ...])

Methods

ReconstMAPMRIFlow([output_strategy, ...])

Methods

TensorModel(gtab[, fit_method, return_S0_hat]) Diffusion Tensor
Workflow([output_strategy, mix_names, ...])

Methods

auto_response(gtab, data[, roi_center, ...]) Automatic estimation of response function using FA.
axial_diffusivity(evals[, axis]) Axial Diffusivity (AD) of a diffusion tensor.
color_fa(fa, evecs) Color fractional anisotropy of diffusion tensor
fractional_anisotropy(evals[, axis]) Fractional anisotropy (FA) of a diffusion tensor.
geodesic_anisotropy(evals[, axis]) Geodesic anisotropy (GA) of a diffusion tensor.
get_mode(q_form) Mode (MO) of a diffusion tensor [R507].
get_sphere([name]) provide triangulated spheres
gradient_table(bvals[, bvecs, big_delta, ...]) A general function for creating diffusion MR gradients.
literal_eval(node_or_string) Safely evaluate an expression node or a string containing a Python expression.
lower_triangular(tensor[, b0]) Returns the six lower triangular values of the tensor and a dummy variable
mean_diffusivity(evals[, axis]) Mean Diffusivity (MD) of a diffusion tensor.
peaks_from_model(model, data, sphere, ...[, ...]) Fit the model to data and computes peaks and metrics
peaks_to_niftis(pam, fname_shm, fname_dirs, ...) Save SH, directions, indices and values of peaks to Nifti.
radial_diffusivity(evals[, axis]) Radial Diffusivity (RD) of a diffusion tensor.
read_bvals_bvecs(fbvals, fbvecs) Read b-values and b-vectors from disk
save_peaks(fname, pam[, affine, verbose]) Save all important attributes of object PeaksAndMetrics in a PAM5 file (HDF5).
split_dki_param(dki_params) Extract the diffusion tensor eigenvalues, the diffusion tensor

Module: workflows.segment

MedianOtsuFlow([output_strategy, mix_names, ...])

Methods

Workflow([output_strategy, mix_names, ...])

Methods

load_nifti(fname[, return_img, ...])
median_otsu(input_volume[, median_radius, ...]) Simple brain extraction tool method for images from DWI data.
save_nifti(fname, data, affine[, hdr])

Module: workflows.tracking

DetTrackPAMFlow([output_strategy, ...])

Methods

DeterministicMaximumDirectionGetter Return direction of a sphere with the highest probability mass function (pmf).
GenericTrackFlow([output_strategy, ...])

Methods

LocalTracking(direction_getter, ...[, ...])
ThresholdTissueClassifier Declarations from tissue_classifier.pxd below
Tractogram([streamlines, ...]) Container for streamlines and their data information.
Workflow([output_strategy, mix_names, ...])

Methods

load_nifti(fname[, return_img, ...])
load_peaks(fname[, verbose]) Load a PeaksAndMetrics HDF5 file (PAM5)
save(tractogram, filename, **kwargs) Saves a tractogram to a file.

Module: workflows.workflow

Workflow([output_strategy, mix_names, ...])

Methods

io_iterator_(frame, fnc[, output_strategy, ...]) Creates an IOIterator using introspection.

ResliceFlow

class dipy.workflows.align.ResliceFlow(output_strategy='append', mix_names=False, force=False, skip=False)

Bases: dipy.workflows.workflow.Workflow

Methods

get_io_iterator() Create an iterator for IO.
get_short_name()
get_sub_runs() No sub runs since this is a simple workflow.
manage_output_overwrite() Check if a file will be overwritten upon processing the inputs.
run(input_files, new_vox_size[, order, ...]) Reslice data with new voxel resolution defined by new_vox_size
__init__(output_strategy='append', mix_names=False, force=False, skip=False)

The basic workflow object.

This object takes care of any workflow operation that is common to all the workflows. Every new workflow should extend this class.

classmethod get_short_name()
run(input_files, new_vox_size, order=1, mode='constant', cval=0, num_processes=1, out_dir='', out_resliced='resliced.nii.gz')

Reslice data with new voxel resolution defined by new_vox_size

Parameters:

input_files : string

Path to the input volumes. This path may contain wildcards to process multiple inputs at once.

new_vox_size : variable float

new voxel size

order : int, optional

Order of interpolation for resampling/reslicing, from 0 to 5: 0 is nearest-neighbor, 1 is trilinear, and so on. Use 0 if you do not want any smoothing (default 1)

mode : string, optional

Points outside the boundaries of the input are filled according to the given mode ‘constant’, ‘nearest’, ‘reflect’ or ‘wrap’ (default ‘constant’)

cval : float, optional

Value used for points outside the boundaries of the input if mode=’constant’ (default 0)

num_processes : int, optional

Split the calculation to a pool of children processes. This only applies to 4D data arrays. If a positive integer then it defines the size of the multiprocessing pool that will be used. If 0, then the size of the pool will equal the number of cores available. (default 1)

out_dir : string, optional

Output directory (default input file directory)

out_resliced : string, optional

Name of the resliced dataset to be saved (default ‘resliced.nii.gz’)
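The relationship between the old and new voxel sizes determines the output grid. A minimal numpy sketch of the shape arithmetic (illustrative only, not dipy's actual reslicing code; `estimate_resliced_shape` is a made-up helper):

```python
import numpy as np

def estimate_resliced_shape(shape, zooms, new_zooms):
    # Scale each spatial dimension by the ratio of old to new voxel size.
    factors = np.array(zooms, dtype=float) / np.array(new_zooms, dtype=float)
    return tuple(int(n) for n in np.round(np.array(shape[:3]) * factors))

# A 58x58x24 volume at 4x4x5 mm resliced to 3 mm isotropic voxels:
print(estimate_resliced_shape((58, 58, 24), (4., 4., 5.), (3., 3., 3.)))
# -> (77, 77, 40), the shape reported in the reslice example below
```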

Workflow

class dipy.workflows.align.Workflow(output_strategy='append', mix_names=False, force=False, skip=False)

Bases: object

Methods

get_io_iterator() Create an iterator for IO.
get_short_name() A short name for the workflow, used to subdivide the command-line parameters.
get_sub_runs() No sub runs since this is a simple workflow.
manage_output_overwrite() Check if a file will be overwritten upon processing the inputs.
run() Since this is an abstract class, raise an exception if this code is reached.
__init__(output_strategy='append', mix_names=False, force=False, skip=False)

The basic workflow object.

This object takes care of any workflow operation that is common to all the workflows. Every new workflow should extend this class.

get_io_iterator()

Create an iterator for IO.

Use a couple of inspection tricks to build an IOIterator using the previous frame (values of local variables and other contextuals) and the run method’s docstring.

classmethod get_short_name()

A short name for the workflow, used to subdivide the command-line parameters.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having sub-workflow parameters with the same name.

For example, a combined workflow with dti reconstruction and csd reconstruction might end up with a b0_threshold parameter in each. Using short names, dti.b0_threshold and csd.b0_threshold will be available instead.

Returns the class name by default, but it is strongly advised to set it to something shorter and easier to write on the command line.
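The dotted prefixing can be pictured with a plain dictionary transform. This is only an illustration of the naming scheme, not dipy's code (`prefix_params` is a made-up helper):

```python
def prefix_params(short_name, params):
    # Qualify each parameter name with the flow's short name.
    return {'%s.%s' % (short_name, k): v for k, v in params.items()}

dti = prefix_params('dti', {'b0_threshold': 50})
csd = prefix_params('csd', {'b0_threshold': 50})
merged = dict(dti, **csd)  # no collision thanks to the prefixes
print(sorted(merged))  # ['csd.b0_threshold', 'dti.b0_threshold']
```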

get_sub_runs()

No sub runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

If it is bound to happen, an action is taken depending on self._force_overwrite (or --force via the command line). A log message is output regardless of the outcome to tell the user what happened.

run()

Since this is an abstract class, raise an exception if this code is reached (run() not implemented in the child class, or called directly on this class)

load_nifti

dipy.workflows.align.load_nifti(fname, return_img=False, return_voxsize=False, return_coords=False)

reslice

dipy.workflows.align.reslice(data, affine, zooms, new_zooms, order=1, mode='constant', cval=0, num_processes=1)

Reslice data with new voxel resolution defined by new_zooms

Parameters:

data : array, shape (I,J,K) or (I,J,K,N)

3d volume or 4d volume with datasets

affine : array, shape (4,4)

mapping from voxel coordinates to world coordinates

zooms : tuple, shape (3,)

voxel size for (i,j,k) dimensions

new_zooms : tuple, shape (3,)

new voxel size for (i,j,k) after resampling

order : int, from 0 to 5

Order of interpolation for resampling/reslicing, from 0 to 5: 0 is nearest-neighbor, 1 is trilinear, and so on. Use 0 if you do not want any smoothing.

mode : string (‘constant’, ‘nearest’, ‘reflect’ or ‘wrap’)

Points outside the boundaries of the input are filled according to the given mode.

cval : float

Value used for points outside the boundaries of the input if mode=’constant’.

num_processes : int

Split the calculation to a pool of children processes. This only applies to 4D data arrays. If a positive integer then it defines the size of the multiprocessing pool that will be used. If 0, then the size of the pool will equal the number of cores available.

Returns:

data2 : array, shape (I,J,K) or (I,J,K,N)

datasets resampled into isotropic voxel size

affine2 : array, shape (4,4)

new affine for the resampled image

Examples

>>> import nibabel as nib
>>> from dipy.align.reslice import reslice
>>> from dipy.data import get_data
>>> fimg = get_data('aniso_vox')
>>> img = nib.load(fimg)
>>> data = img.get_data()
>>> data.shape == (58, 58, 24)
True
>>> affine = img.affine
>>> zooms = img.header.get_zooms()[:3]
>>> zooms
(4.0, 4.0, 5.0)
>>> new_zooms = (3.,3.,3.)
>>> new_zooms
(3.0, 3.0, 3.0)
>>> data2, affine2 = reslice(data, affine, zooms, new_zooms)
>>> data2.shape == (77, 77, 40)
True

save_nifti

dipy.workflows.align.save_nifti(fname, data, affine, hdr=None)

IntrospectiveArgumentParser

class dipy.workflows.base.IntrospectiveArgumentParser(prog=None, usage=None, description=None, epilog=None, version=None, parents=[], formatter_class=<class 'dipy.fixes.argparse.RawTextHelpFormatter'>, prefix_chars='-', fromfile_prefix_chars=None, argument_default=None, conflict_handler='resolve', add_help=True)

Bases: dipy.fixes.argparse.ArgumentParser

Methods

add_argument(dest, ...[, name, name])
add_argument_group(*args, **kwargs)
add_description()
add_epilogue()
add_mutually_exclusive_group(**kwargs)
add_sub_flow_args(sub_flows) Take an array of workflow objects and use introspection to extract the parameters, types and docstrings of their run method.
add_subparsers(**kwargs)
add_workflow(workflow) Take a workflow object and use introspection to extract the parameters, types and docstrings of its run method.
error(message: string) Prints a usage message incorporating the message to stderr and exits.
exit([status, message])
format_help()
format_usage()
format_version()
get_flow_args([args, namespace]) Returns the parsed arguments as a dictionary that will be used as a workflow’s run method arguments.
get_outputs()
parse_args([args, namespace])
parse_known_args([args, namespace])
print_help([file])
print_usage([file])
print_version([file])
register(registry_name, value, object)
set_defaults(**kwargs)
show_argument(dest)
update_argument(*args, **kargs)
__init__(prog=None, usage=None, description=None, epilog=None, version=None, parents=[], formatter_class=<class 'dipy.fixes.argparse.RawTextHelpFormatter'>, prefix_chars='-', fromfile_prefix_chars=None, argument_default=None, conflict_handler='resolve', add_help=True)

Augmenting the argument parser to allow automatic creation of arguments from workflows

Parameters:

prog : None

The name of the program (default: sys.argv[0])

usage : None

A usage message (default: auto-generated from arguments)

description : str

A description of what the program does

epilog : str

Text following the argument descriptions

version : None

Add a -v/--version option with the given version string

parents : list

Parsers whose arguments should be copied into this one

formatter_class : obj

HelpFormatter class for printing help messages

prefix_chars : str

Characters that prefix optional arguments

fromfile_prefix_chars : None

Characters that prefix files containing additional arguments

argument_default : None

The default value for all arguments

conflict_handler : str

String indicating how to handle conflicts

add_help : bool

Add a -h/--help option

add_description()
add_epilogue()
add_sub_flow_args(sub_flows)

Take an array of workflow objects and use introspection to extract the parameters, types and docstrings of their run method. Only the optional input parameters are extracted for these as they are treated as sub workflows.

Parameters:

sub_flows : array of dipy.workflows.workflow.Workflow

Workflows to inspect.

Returns:

sub_flow_optionals : dictionary of all sub workflow optional parameters

add_workflow(workflow)

Take a workflow object and use introspection to extract the parameters, types and docstrings of its run method. Then add these parameters to the current argparser's own params to parse. If the workflow is of type combined_workflow, the optional input parameters of its sub workflows will also be added.

Parameters:

workflow : dipy.workflows.workflow.Workflow

Workflow from which to infer parameters.

Returns:

sub_flow_optionals : dictionary of all sub workflow optional parameters

get_flow_args(args=None, namespace=None)

Returns the parsed arguments as a dictionary that will be used as a workflow’s run method arguments.

get_outputs()
show_argument(dest)
update_argument(*args, **kargs)
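The introspection pattern behind add_workflow and get_flow_args can be sketched with the standard library alone: read a run method's signature and turn keyword arguments into optional command-line flags. This toy parser is a simplified stand-in, not the real IntrospectiveArgumentParser:

```python
import argparse
import inspect

class ToyFlow:
    def run(self, input_files, sigma=0.0, out_dir=''):
        return input_files, sigma, out_dir

def build_parser(flow):
    # Build an argparse parser from a flow's run() signature (sketch).
    parser = argparse.ArgumentParser()
    sig = inspect.signature(flow.run)
    for name, param in sig.parameters.items():
        if param.default is inspect.Parameter.empty:
            parser.add_argument(name)  # required positional argument
        else:
            parser.add_argument('--' + name, default=param.default,
                                type=type(param.default))
    return parser

flow = ToyFlow()
args = vars(build_parser(flow).parse_args(['dwi.nii.gz', '--sigma', '2.5']))
flow.run(**args)  # the parsed dict feeds run(), like get_flow_args()
print(args['sigma'])  # 2.5
```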

NumpyDocString

class dipy.workflows.base.NumpyDocString(docstring, config={})

Bases: object

__init__(docstring, config={})

get_args_default

dipy.workflows.base.get_args_default(func)

CombinedWorkflow

class dipy.workflows.combined_workflow.CombinedWorkflow(output_strategy='append', mix_names=False, force=False, skip=False)

Bases: dipy.workflows.workflow.Workflow

Methods

get_io_iterator() Create an iterator for IO.
get_optionals(flow, **kwargs) Returns the sub flow’s optional arguments merged with those passed as params in kwargs.
get_short_name() A short name for the workflow, used to subdivide the command-line parameters.
get_sub_runs() Returns a list of tuples (sub flow name, sub flow run method, sub flow short name) to be used in the sub flow parameters extraction.
manage_output_overwrite() Check if a file will be overwritten upon processing the inputs.
run() Since this is an abstract class, raise an exception if this code is reached.
run_sub_flow(flow, *args, **kwargs) Runs the sub flow with the optional parameters passed via the command line.
set_sub_flows_optionals(opts) Sets the self._optionals variable with all sub flow arguments that were passed in the commandline.
__init__(output_strategy='append', mix_names=False, force=False, skip=False)

Workflow that combines multiple workflows. The workflows combined together are referred to as sub flows in this class.

get_optionals(flow, **kwargs)

Returns the sub flow’s optional arguments merged with those passed as params in kwargs.

get_sub_runs()

Returns a list of tuples (sub flow name, sub flow run method, sub flow short name) to be used in the sub flow parameters extraction.

run_sub_flow(flow, *args, **kwargs)

Runs the sub flow with the optional parameters passed via the command line. This is a convenience method to make sub flow running more intuitive on the concrete CombinedWorkflow side.

set_sub_flows_optionals(opts)

Sets the self._optionals variable with all sub flow arguments that were passed in the commandline.

Workflow

class dipy.workflows.combined_workflow.Workflow(output_strategy='append', mix_names=False, force=False, skip=False)

Bases: object

Methods

get_io_iterator() Create an iterator for IO.
get_short_name() A short name for the workflow, used to subdivide the command-line parameters.
get_sub_runs() No sub runs since this is a simple workflow.
manage_output_overwrite() Check if a file will be overwritten upon processing the inputs.
run() Since this is an abstract class, raise an exception if this code is reached.
__init__(output_strategy='append', mix_names=False, force=False, skip=False)

The basic workflow object.

This object takes care of any workflow operation that is common to all the workflows. Every new workflow should extend this class.

get_io_iterator()

Create an iterator for IO.

Use a couple of inspection tricks to build an IOIterator using the previous frame (values of local variables and other contextuals) and the run method’s docstring.

classmethod get_short_name()

A short name for the workflow, used to subdivide the command-line parameters.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having sub-workflow parameters with the same name.

For example, a combined workflow with dti reconstruction and csd reconstruction might end up with a b0_threshold parameter in each. Using short names, dti.b0_threshold and csd.b0_threshold will be available instead.

Returns the class name by default, but it is strongly advised to set it to something shorter and easier to write on the command line.

get_sub_runs()

No sub runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

If it is bound to happen, an action is taken depending on self._force_overwrite (or --force via the command line). A log message is output regardless of the outcome to tell the user what happened.

run()

Since this is an abstract class, raise an exception if this code is reached (run() not implemented in the child class, or called directly on this class)

iteritems

dipy.workflows.combined_workflow.iteritems(d, **kw)

Return an iterator over the (key, value) pairs of a dictionary.

NLMeansFlow

class dipy.workflows.denoise.NLMeansFlow(output_strategy='append', mix_names=False, force=False, skip=False)

Bases: dipy.workflows.workflow.Workflow

Methods

get_io_iterator() Create an iterator for IO.
get_short_name()
get_sub_runs() No sub runs since this is a simple workflow.
manage_output_overwrite() Check if a file will be overwritten upon processing the inputs.
run(input_files[, sigma, out_dir, out_denoised]) Workflow wrapping the nlmeans denoising method.
__init__(output_strategy='append', mix_names=False, force=False, skip=False)

The basic workflow object.

This object takes care of any workflow operation that is common to all the workflows. Every new workflow should extend this class.

classmethod get_short_name()
run(input_files, sigma=0, out_dir='', out_denoised='dwi_nlmeans.nii.gz')

Workflow wrapping the nlmeans denoising method.

It applies nlmeans denoising on each file found by 'globbing' input_files and saves the results in a directory specified by out_dir.

Parameters:

input_files : string

Path to the input volumes. This path may contain wildcards to process multiple inputs at once.

sigma : float, optional

Sigma parameter to pass to the nlmeans algorithm (default: auto estimation).

out_dir : string, optional

Output directory (default input file directory)

out_denoised : string, optional

Name of the resulting denoised volume (default: dwi_nlmeans.nii.gz)

Workflow

class dipy.workflows.denoise.Workflow(output_strategy='append', mix_names=False, force=False, skip=False)

Bases: object

Methods

get_io_iterator() Create an iterator for IO.
get_short_name() A short name for the workflow, used to subdivide the command-line parameters.
get_sub_runs() No sub runs since this is a simple workflow.
manage_output_overwrite() Check if a file will be overwritten upon processing the inputs.
run() Since this is an abstract class, raise an exception if this code is reached.
__init__(output_strategy='append', mix_names=False, force=False, skip=False)

The basic workflow object.

This object takes care of any workflow operation that is common to all the workflows. Every new workflow should extend this class.

get_io_iterator()

Create an iterator for IO.

Use a couple of inspection tricks to build an IOIterator using the previous frame (values of local variables and other contextuals) and the run method’s docstring.

classmethod get_short_name()

A short name for the workflow, used to subdivide the command-line parameters.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having sub-workflow parameters with the same name.

For example, a combined workflow with dti reconstruction and csd reconstruction might end up with a b0_threshold parameter in each. Using short names, dti.b0_threshold and csd.b0_threshold will be available instead.

Returns the class name by default, but it is strongly advised to set it to something shorter and easier to write on the command line.

get_sub_runs()

No sub runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

If it is bound to happen, an action is taken depending on self._force_overwrite (or --force via the command line). A log message is output regardless of the outcome to tell the user what happened.

run()

Since this is an abstract class, raise an exception if this code is reached (run() not implemented in the child class, or called directly on this class)

estimate_sigma

dipy.workflows.denoise.estimate_sigma(arr, disable_background_masking=False, N=0)

Standard deviation estimation from local patches

Parameters:

arr : 3D or 4D ndarray

The array to be estimated

disable_background_masking : bool, default False

If True, uses all voxels for the estimation; otherwise only non-zero voxels are used. Useful if the background is masked by the scanner.

N : int, default 0

Number of coils of the receiver array. Use N = 1 in case of a SENSE reconstruction (Philips scanners) or the number of coils for a GRAPPA reconstruction (Siemens and GE). Use 0 to disable the correction factor, as for example if the noise is Gaussian distributed. See [1] for more information.

Returns:

sigma : ndarray

standard deviation of the noise, one estimation per volume.
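The patch-based idea can be illustrated in plain numpy: take the standard deviation of many small patches and keep a robust summary. This is a crude stand-in for dipy's estimator, not its actual algorithm (`local_patch_sigma` is a made-up helper):

```python
import numpy as np

def local_patch_sigma(vol, radius=1):
    # Crude noise estimate: median std over small non-overlapping patches.
    w = 2 * radius + 1
    stds = []
    for i in range(0, vol.shape[0] - w + 1, w):
        for j in range(0, vol.shape[1] - w + 1, w):
            for k in range(0, vol.shape[2] - w + 1, w):
                patch = vol[i:i + w, j:j + w, k:k + w]
                if patch.any():           # skip masked-out background
                    stds.append(patch.std())
    return float(np.median(stds))

rng = np.random.RandomState(0)
vol = 100.0 + 5.0 * rng.randn(12, 12, 12)   # synthetic volume, sigma ~ 5
print(round(local_patch_sigma(vol), 1))      # close to 5
```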

nlmeans

dipy.workflows.denoise.nlmeans(arr, sigma, mask=None, patch_radius=1, block_radius=5, rician=True, num_threads=None)

Non-local means for denoising 3D and 4D images

Parameters:

arr : 3D or 4D ndarray

The array to be denoised

mask : 3D ndarray

sigma : float or 3D array

standard deviation of the noise estimated from the data

patch_radius : int

patch size is 2 x patch_radius + 1. Default is 1.

block_radius : int

block size is 2 x block_radius + 1. Default is 5.

rician : boolean

If True the noise is estimated as Rician, otherwise Gaussian noise is assumed.

num_threads : int

Number of threads. If None (default) then all available threads will be used (all CPU cores).

Returns:

denoised_arr : ndarray

the denoised arr which has the same shape as arr.

References

[Descoteaux08] Descoteaux, Maxim; Wiest-Daesslé, Nicolas; Prima, Sylvain; Barillot, Christian; Deriche, Rachid. Impact of Rician Adapted Non-Local Means Filtering on HARDI, MICCAI 2008.
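Conceptually, non-local means replaces each sample with a weighted average of samples whose surrounding patches look similar, with weights exp(-d^2 / h^2) in the patch distance d. A minimal 1-D sketch of that weighting (purely illustrative; dipy's implementation differs and `nlm_pixel` is a made-up helper):

```python
import numpy as np

def nlm_pixel(signal, idx, patch_radius=1, h=0.5):
    # Denoise one sample by patch-similarity weighting (1-D sketch).
    pr = patch_radius
    ref = signal[idx - pr: idx + pr + 1]
    num, den = 0.0, 0.0
    for j in range(pr, len(signal) - pr):
        patch = signal[j - pr: j + pr + 1]
        d2 = float(np.mean((patch - ref) ** 2))  # patch distance
        w = np.exp(-d2 / h ** 2)                 # similarity weight
        num += w * signal[j]
        den += w
    return num / den

rng = np.random.RandomState(1)
noisy = np.ones(50) + 0.2 * rng.randn(50)   # flat signal plus noise
denoised = nlm_pixel(noisy, idx=25)
print(round(float(denoised), 3))             # pulled back toward 1.0
```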

NumpyDocString

class dipy.workflows.docstring_parser.NumpyDocString(docstring, config={})

Bases: object

__init__(docstring, config={})

Reader

class dipy.workflows.docstring_parser.Reader(data)

Bases: object

A line-based string reader.

Methods

eof()
is_empty()
peek([n])
read()
read_to_condition(condition_func)
read_to_next_empty_line()
read_to_next_unindented_line()
reset()
seek_next_non_empty_line()
__init__(data)
Parameters:

data : str

String with lines separated by '\n'.

eof()
is_empty()
peek(n=0)
read()
read_to_condition(condition_func)
read_to_next_empty_line()
read_to_next_unindented_line()
reset()
seek_next_non_empty_line()

dedent_lines

dipy.workflows.docstring_parser.dedent_lines(lines)

Deindent a list of lines maximally
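Maximal deindentation removes the largest run of leading whitespace common to all lines. The standard library expresses the same idea (this uses textwrap, not the numpydoc helper itself):

```python
import textwrap

lines = ['    def f(x):', '        return x', '']
# Join, strip the common leading whitespace, split back into lines.
dedented = textwrap.dedent('\n'.join(lines)).split('\n')
print(dedented)  # ['def f(x):', '    return x', '']
```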

warn

dipy.workflows.docstring_parser.warn()

Issue a warning, or maybe ignore it or raise an exception.

IntrospectiveArgumentParser

class dipy.workflows.flow_runner.IntrospectiveArgumentParser(prog=None, usage=None, description=None, epilog=None, version=None, parents=[], formatter_class=<class 'dipy.fixes.argparse.RawTextHelpFormatter'>, prefix_chars='-', fromfile_prefix_chars=None, argument_default=None, conflict_handler='resolve', add_help=True)

Bases: dipy.fixes.argparse.ArgumentParser

Methods

add_argument(dest, ...[, name, name])
add_argument_group(*args, **kwargs)
add_description()
add_epilogue()
add_mutually_exclusive_group(**kwargs)
add_sub_flow_args(sub_flows) Take an array of workflow objects and use introspection to extract the parameters, types and docstrings of their run method.
add_subparsers(**kwargs)
add_workflow(workflow) Take a workflow object and use introspection to extract the parameters, types and docstrings of its run method.
error(message: string) Prints a usage message incorporating the message to stderr and exits.
exit([status, message])
format_help()
format_usage()
format_version()
get_flow_args([args, namespace]) Returns the parsed arguments as a dictionary that will be used as a workflow’s run method arguments.
get_outputs()
parse_args([args, namespace])
parse_known_args([args, namespace])
print_help([file])
print_usage([file])
print_version([file])
register(registry_name, value, object)
set_defaults(**kwargs)
show_argument(dest)
update_argument(*args, **kargs)
__init__(prog=None, usage=None, description=None, epilog=None, version=None, parents=[], formatter_class=<class 'dipy.fixes.argparse.RawTextHelpFormatter'>, prefix_chars='-', fromfile_prefix_chars=None, argument_default=None, conflict_handler='resolve', add_help=True)

Augmenting the argument parser to allow automatic creation of arguments from workflows

Parameters:

prog : None

The name of the program (default: sys.argv[0])

usage : None

A usage message (default: auto-generated from arguments)

description : str

A description of what the program does

epilog : str

Text following the argument descriptions

version : None

Add a -v/--version option with the given version string

parents : list

Parsers whose arguments should be copied into this one

formatter_class : obj

HelpFormatter class for printing help messages

prefix_chars : str

Characters that prefix optional arguments

fromfile_prefix_chars : None

Characters that prefix files containing additional arguments

argument_default : None

The default value for all arguments

conflict_handler : str

String indicating how to handle conflicts

add_help : bool

Add a -h/--help option

add_description()
add_epilogue()
add_sub_flow_args(sub_flows)

Take an array of workflow objects and use introspection to extract the parameters, types and docstrings of their run method. Only the optional input parameters are extracted for these as they are treated as sub workflows.

Parameters:

sub_flows : array of dipy.workflows.workflow.Workflow

Workflows to inspect.

Returns:

sub_flow_optionals : dictionary of all sub workflow optional parameters

add_workflow(workflow)

Take a workflow object and use introspection to extract the parameters, types and docstrings of its run method. Then add these parameters to the current argparser's own params to parse. If the workflow is of type combined_workflow, the optional input parameters of its sub workflows will also be added.

Parameters:

workflow : dipy.workflows.workflow.Workflow

Workflow from which to infer parameters.

Returns:

sub_flow_optionals : dictionary of all sub workflow optional parameters

get_flow_args(args=None, namespace=None)

Returns the parsed arguments as a dictionary that will be used as a workflow’s run method arguments.

get_outputs()
show_argument(dest)
update_argument(*args, **kargs)

get_level

dipy.workflows.flow_runner.get_level(lvl)

Transforms the logging level passed on the command line into a proper logging level name.
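The translation from a command-line string to a logging level can be sketched with the standard logging module; dipy's actual helper may differ (`to_level` is a made-up name):

```python
import logging

def to_level(lvl):
    # Translate a level name like 'info' into a logging constant (sketch).
    level = logging.getLevelName(lvl.upper())
    return level if isinstance(level, int) else logging.INFO  # fallback

print(to_level('debug') == logging.DEBUG)  # True
```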

iteritems

dipy.workflows.flow_runner.iteritems(d, **kw)

Return an iterator over the (key, value) pairs of a dictionary.

run_flow

dipy.workflows.flow_runner.run_flow(flow)

Wraps the process of building an argparser that reflects the workflow we want to run, along with some generic parameters like logging, force and output strategies. The resulting parameters are then fed to the workflow's run method.

IoInfoFlow

class dipy.workflows.io.IoInfoFlow(output_strategy='append', mix_names=False, force=False, skip=False)

Bases: dipy.workflows.workflow.Workflow

Methods

get_io_iterator() Create an iterator for IO.
get_short_name()
get_sub_runs() No sub runs since this is a simple workflow.
manage_output_overwrite() Check if a file will be overwritten upon processing the inputs.
run(input_files[, b0_threshold, bvecs_tol, ...]) Provides useful information about different files used in medical imaging.
__init__(output_strategy='append', mix_names=False, force=False, skip=False)

The basic workflow object.

This object takes care of any workflow operation that is common to all the workflows. Every new workflow should extend this class.

classmethod get_short_name()
run(input_files, b0_threshold=50, bvecs_tol=0.01, bshell_thr=100)

Provides useful information about different files used in medical imaging. Any number of input files can be provided. The program identifies the type of file by its extension.

Parameters:

input_files : variable string

Any number of Nifti1, bvals or bvecs files.

b0_threshold : float, optional

Threshold used to find b=0 directions (default 50)

bvecs_tol : float, optional

Threshold used to check that norm(bvec) = 1 +/- bvecs_tol, i.e. that b-vectors are unit vectors (default 0.01)

bshell_thr : float, optional

Threshold for distinguishing b-values in different shells (default 100)
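The bvecs_tol and bshell_thr parameters can be illustrated with plain-Python sketches; check_bvecs and group_shells are hypothetical helpers showing what the two thresholds control, not dipy’s implementation:

```python
import math

def check_bvecs(bvecs, tol=0.01):
    """Sketch: verify every non-zero b-vector is unit length
    within the given tolerance."""
    for bvec in bvecs:
        norm = math.sqrt(sum(c * c for c in bvec))
        if norm > 0 and abs(norm - 1.0) > tol:
            return False
    return True

def group_shells(bvals, thr=100):
    """Sketch: b-values closer than thr to the previous value are
    placed in the same shell."""
    shells = []
    for b in sorted(bvals):
        if shells and b - shells[-1][-1] <= thr:
            shells[-1].append(b)
        else:
            shells.append([b])
    return shells

print(check_bvecs([[1, 0, 0], [0, 0, 0]]))        # True (zero vecs skipped)
print(len(group_shells([0, 5, 995, 1000, 2000])))  # 3 shells
```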

Workflow

class dipy.workflows.io.Workflow(output_strategy='append', mix_names=False, force=False, skip=False)

Bases: object

Methods

get_io_iterator() Create an iterator for IO.
get_short_name() A short name for the workflow used to subdivide
get_sub_runs() No sub runs since this is a simple workflow.
manage_output_overwrite() Check if a file will be overwritten upon processing the inputs.
run() Since this is an abstract class, raise exception if this code is
__init__(output_strategy='append', mix_names=False, force=False, skip=False)

The basic workflow object.

This object takes care of any workflow operation that is common to all the workflows. Every new workflow should extend this class.

get_io_iterator()

Create an iterator for IO.

Use a couple of inspection tricks to build an IOIterator using the previous frame (values of local variables and other contextuals) and the run method’s docstring.

classmethod get_short_name()

A short name for the workflow, used to subdivide command-line parameters.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of sub-workflow parameters sharing the same name.

For example, a combined workflow with dti reconstruction and csd reconstruction might both end up with a b0_threshold parameter. Using short names, dti.b0_threshold and csd.b0_threshold become available instead.

Returns the class name by default, but it is strongly advised to set it to something shorter and easier to write on the command line.
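The subdivision scheme can be shown with a minimal sketch; the classes below are simplified stand-ins and prefixed is a hypothetical helper, not dipy API:

```python
class Workflow:
    @classmethod
    def get_short_name(cls):
        # Default: fall back to the class name itself.
        return cls.__name__

class ReconstDtiFlow(Workflow):
    @classmethod
    def get_short_name(cls):
        # Overridden with something short for the command line.
        return "dti"

def prefixed(flow, param):
    """Sketch: how a combined workflow might namespace a parameter."""
    return "%s.%s" % (flow.get_short_name(), param)

print(Workflow.get_short_name())                    # Workflow
print(prefixed(ReconstDtiFlow, "b0_threshold"))     # dti.b0_threshold
```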

get_sub_runs()

No sub runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

If it is bound to happen, an action is taken depending on self._force_overwrite (or --force via the command line). A log message is output regardless of the outcome to tell the user something happened.

run()

Since this is an abstract class, raise an exception if this code is reached (not implemented in the child class, or called directly on this class)

load_nifti

dipy.workflows.io.load_nifti(fname, return_img=False, return_voxsize=False, return_coords=False)

MaskFlow

class dipy.workflows.mask.MaskFlow(output_strategy='append', mix_names=False, force=False, skip=False)

Bases: dipy.workflows.workflow.Workflow

Methods

get_io_iterator() Create an iterator for IO.
get_short_name()
get_sub_runs() No sub runs since this is a simple workflow.
manage_output_overwrite() Check if a file will be overwritten upon processing the inputs.
run(input_files, lb[, ub, out_dir, out_mask]) Workflow for creating a binary mask
__init__(output_strategy='append', mix_names=False, force=False, skip=False)

The basic workflow object.

This object takes care of any workflow operation that is common to all the workflows. Every new workflow should extend this class.

classmethod get_short_name()
run(input_files, lb, ub=inf, out_dir='', out_mask='mask.nii.gz')

Workflow for creating a binary mask

Parameters:

input_files : string

Path to image to be masked.

lb : float

Lower bound value.

ub : float

Upper bound value (default Inf)

out_dir : string, optional

Output directory (default input file directory)

out_mask : string, optional

Name of the masked file (default ‘mask.nii.gz’)
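The masking rule itself reduces to a bound check per value; the sketch below illustrates it on a plain Python list (binary_mask is a hypothetical helper, not the workflow’s actual code, which operates on nifti volumes):

```python
def binary_mask(values, lb, ub=float("inf")):
    """Sketch of the MaskFlow rule: 1 where lb <= value <= ub, else 0."""
    return [1 if lb <= v <= ub else 0 for v in values]

print(binary_mask([0.1, 0.5, 0.9], 0.3))       # [0, 1, 1] (ub defaults to Inf)
print(binary_mask([0.1, 0.5, 0.9], 0.3, 0.7))  # [0, 1, 0]
```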

Workflow

class dipy.workflows.mask.Workflow(output_strategy='append', mix_names=False, force=False, skip=False)

Bases: object

Methods

get_io_iterator() Create an iterator for IO.
get_short_name() A short name for the workflow used to subdivide
get_sub_runs() No sub runs since this is a simple workflow.
manage_output_overwrite() Check if a file will be overwritten upon processing the inputs.
run() Since this is an abstract class, raise exception if this code is
__init__(output_strategy='append', mix_names=False, force=False, skip=False)

The basic workflow object.

This object takes care of any workflow operation that is common to all the workflows. Every new workflow should extend this class.

get_io_iterator()

Create an iterator for IO.

Use a couple of inspection tricks to build an IOIterator using the previous frame (values of local variables and other contextuals) and the run method’s docstring.

classmethod get_short_name()

A short name for the workflow, used to subdivide command-line parameters.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of sub-workflow parameters sharing the same name.

For example, a combined workflow with dti reconstruction and csd reconstruction might both end up with a b0_threshold parameter. Using short names, dti.b0_threshold and csd.b0_threshold become available instead.

Returns the class name by default, but it is strongly advised to set it to something shorter and easier to write on the command line.

get_sub_runs()

No sub runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

If it is bound to happen, an action is taken depending on self._force_overwrite (or --force via the command line). A log message is output regardless of the outcome to tell the user something happened.

run()

Since this is an abstract class, raise an exception if this code is reached (not implemented in the child class, or called directly on this class)

load_nifti

dipy.workflows.mask.load_nifti(fname, return_img=False, return_voxsize=False, return_coords=False)

save_nifti

dipy.workflows.mask.save_nifti(fname, data, affine, hdr=None)

IOIterator

class dipy.workflows.multi_io.IOIterator(output_strategy='append', mix_names=False)

Bases: object

Create output filenames that work nicely with multiple input files from multiple directories (processing multiple subjects with one command)

Use information from input files, out_dir and out_fnames to generate correct outputs which can come from long lists of multiple or single inputs.

Methods

create_directories()
create_outputs()
set_inputs(*args)
set_out_dir(out_dir)
set_out_fnames(*args)
set_output_keys(*args)
__init__(output_strategy='append', mix_names=False)
create_directories()
create_outputs()
set_inputs(*args)
set_out_dir(out_dir)
set_out_fnames(*args)
set_output_keys(*args)

basename_without_extension

dipy.workflows.multi_io.basename_without_extension(fname)

common_start

dipy.workflows.multi_io.common_start(sa, sb)

Returns the longest common substring from the beginning of sa and sb
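A minimal sketch of such a longest-common-prefix helper (illustrative only; dipy’s implementation may differ):

```python
def common_start(sa, sb):
    """Sketch: longest common substring from the beginning of sa and sb."""
    prefix = []
    for ca, cb in zip(sa, sb):
        if ca != cb:
            break
        prefix.append(ca)
    return "".join(prefix)

print(common_start("subj01_dwi", "subj01_mask"))  # subj01_
```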

concatenate_inputs

dipy.workflows.multi_io.concatenate_inputs(multi_inputs)

Concatenate list of inputs

connect_output_paths

dipy.workflows.multi_io.connect_output_paths(inputs, out_dir, out_files, output_strategy='append', mix_names=True)

Generates a list of output files paths based on input files and output strategies.

Parameters:

inputs : array

List of input paths.

out_dir : string

The output directory.

out_files : array

List of output files.

output_strategy : string
Which strategy to use to generate the output paths.

‘append’: Add out_dir to the path of the input. ‘prepend’: Add the input path directory tree to out_dir. ‘absolute’: Put directly in out_dir.

mix_names : bool

Whether or not to prepend a string composed of a mix of the input names to the final output name.

Returns:

A list of output file paths.
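The three strategies can be sketched with plain os.path operations; output_path is a hypothetical helper illustrating the strategy descriptions above, not dipy’s actual code:

```python
import os.path

def output_path(input_path, out_dir, out_file, strategy="append"):
    """Sketch of the three output strategies for a single input file."""
    in_dir = os.path.dirname(input_path)
    if strategy == "append":
        # 'append': add out_dir to the path of the input.
        base = os.path.join(in_dir, out_dir)
    elif strategy == "prepend":
        # 'prepend': recreate the input directory tree under out_dir.
        base = os.path.join(out_dir, in_dir.lstrip(os.sep))
    else:
        # 'absolute': put the output directly in out_dir.
        base = out_dir
    return os.path.join(base, out_file)

print(output_path("/data/s1/dwi.nii.gz", "out", "fa.nii.gz", "append"))
# /data/s1/out/fa.nii.gz
```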

get_args_default

dipy.workflows.multi_io.get_args_default(func)

glob

dipy.workflows.multi_io.glob(pathname, *, recursive=False)

Return a list of paths matching a pathname pattern.

The pattern may contain simple shell-style wildcards a la fnmatch. However, unlike fnmatch, filenames starting with a dot are special cases that are not matched by ‘*’ and ‘?’ patterns.

If recursive is true, the pattern ‘**’ will match any files and zero or more directories and subdirectories.

io_iterator

dipy.workflows.multi_io.io_iterator(inputs, out_dir, fnames, output_strategy='append', mix_names=False, out_keys=None)

Creates an IOIterator from the parameters.

Parameters:

inputs : array

List of input files.

out_dir : string

Output directory.

fnames : array

File names of all outputs to be created.

output_strategy : string

Controls the behavior of the IOIterator for output paths.

mix_names : bool

Whether or not to append a mix of input names at the beginning.

Returns:

Properly instantiated IOIterator object.

io_iterator_

dipy.workflows.multi_io.io_iterator_(frame, fnc, output_strategy='append', mix_names=False)

Creates an IOIterator using introspection.

Parameters:

frame : frameobject

Contains the info about the current local variables values.

fnc : function

The function to inspect

output_strategy : string

Controls the behavior of the IOIterator for output paths.

mix_names : bool

Whether or not to append a mix of input names at the beginning.

Returns:

Properly instantiated IOIterator object.

slash_to_under

dipy.workflows.multi_io.slash_to_under(dir_str)
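Minimal sketches of what these two filename helpers presumably do; both are illustrative, and the special handling of .nii.gz double extensions is an assumption:

```python
import os.path

def basename_without_extension(fname):
    """Sketch: strip directory and extension(s) from a path."""
    base = os.path.basename(fname)
    # Assumed: double extensions such as .nii.gz are removed whole.
    if base.endswith(".nii.gz"):
        return base[:-len(".nii.gz")]
    return os.path.splitext(base)[0]

def slash_to_under(dir_str):
    """Sketch: flatten a directory path into a filename-safe token."""
    return dir_str.replace("/", "_")

print(basename_without_extension("/data/s1/dwi.nii.gz"))  # dwi
print(slash_to_under("/data/s1"))                         # _data_s1
```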

ConstrainedSphericalDeconvModel

class dipy.workflows.reconst.ConstrainedSphericalDeconvModel(gtab, response, reg_sphere=None, sh_order=8, lambda_=1, tau=0.1)

Bases: dipy.reconst.shm.SphHarmModel

Methods

cache_clear() Clear the cache.
cache_get(tag, key[, default]) Retrieve a value from the cache.
cache_set(tag, key, value) Store a value in the cache.
fit(data[, mask]) Fit method for every voxel in data
predict(sh_coeff[, gtab, S0]) Compute a signal prediction given spherical harmonic coefficients for the provided GradientTable class instance.
sampling_matrix(sphere) The matrix needed to sample ODFs from coefficients of the model.
__init__(gtab, response, reg_sphere=None, sh_order=8, lambda_=1, tau=0.1)

Constrained Spherical Deconvolution (CSD) [R508].

Spherical deconvolution computes a fiber orientation distribution (FOD), also called fiber ODF (fODF) [R509], as opposed to a diffusion ODF as in the QballModel or the CsaOdfModel. This results in a sharper angular profile with better angular resolution, which is the best object to use for subsequent deterministic and probabilistic tractography [R510].

A sharp fODF is obtained because a single fiber response function is injected as a priori knowledge. The response function is often data-driven and is thus provided as input to the ConstrainedSphericalDeconvModel. It will be used as deconvolution kernel, as described in [R508].

Parameters:

gtab : GradientTable

response : tuple or AxSymShResponse object

A tuple with two elements. The first is the eigenvalues as a (3,) ndarray and the second is the signal value for the response function without diffusion weighting. This is used to generate a single-fiber synthetic signal. The response function will be used as the deconvolution kernel ([R508])

reg_sphere : Sphere (optional)

sphere used to build the regularization B matrix. Default: ‘symmetric362’.

sh_order : int (optional)

maximal spherical harmonics order. Default: 8

lambda_ : float (optional)

weight given to the constrained-positivity regularization part of the deconvolution equation (see [R508]). Default: 1

tau : float (optional)

threshold controlling the amplitude below which the corresponding fODF is assumed to be zero. Ideally, tau should be set to zero. However, to improve the stability of the algorithm, tau is set to tau*100 % of the mean fODF amplitude (here, 10% by default) (see [R508]). Default: 0.1

References

[R508](1, 2, 3, 4, 5, 6) Tournier, J.D., et al. NeuroImage 2007. Robust determination of the fibre orientation distribution in diffusion MRI: Non-negativity constrained super-resolved spherical deconvolution
[R509](1, 2) Descoteaux, M., et al. IEEE TMI 2009. Deterministic and Probabilistic Tractography Based on Complex Fibre Orientation Distributions
[R510](1, 2) Côté, M.-A., et al. Medical Image Analysis 2013. Tractometer: Towards validation of tractography pipelines
[R511]Tournier, J.D, et al. Imaging Systems and Technology 2012. MRtrix: Diffusion Tractography in Crossing Fiber Regions
fit(data, mask=None)

Fit method for every voxel in data

predict(sh_coeff, gtab=None, S0=1.0)

Compute a signal prediction given spherical harmonic coefficients for the provided GradientTable class instance.

Parameters:

sh_coeff : ndarray

The spherical harmonic representation of the FOD from which to make the signal prediction.

gtab : GradientTable

The gradients for which the signal will be predicted. Use the model’s gradient table by default.

S0 : ndarray or float

The non diffusion-weighted signal value.

Returns:

pred_sig : ndarray

The predicted signal.

CsaOdfModel

class dipy.workflows.reconst.CsaOdfModel(gtab, sh_order, smooth=0.006, min_signal=1.0, assume_normed=False)

Bases: dipy.reconst.shm.QballBaseModel

Implementation of Constant Solid Angle reconstruction method.

References

[R512]Aganj, I., et al. 2009. ODF Reconstruction in Q-Ball Imaging With Solid Angle Consideration.

Methods

cache_clear() Clear the cache.
cache_get(tag, key[, default]) Retrieve a value from the cache.
cache_set(tag, key, value) Store a value in the cache.
fit(data[, mask]) Fits the model to diffusion data and returns the model fit
sampling_matrix(sphere) The matrix needed to sample ODFs from coefficients of the model.
__init__(gtab, sh_order, smooth=0.006, min_signal=1.0, assume_normed=False)

Creates a model that can be used to fit or sample diffusion data

See also

normalize_data

max = 0.999
min = 0.001

DiffusionKurtosisModel

class dipy.workflows.reconst.DiffusionKurtosisModel(gtab, fit_method='WLS', *args, **kwargs)

Bases: dipy.reconst.base.ReconstModel

Class for the Diffusion Kurtosis Model

Methods

fit(data[, mask]) Fit method of the DKI model class
predict(dki_params[, S0]) Predict a signal for this DKI model class instance given parameters.
__init__(gtab, fit_method='WLS', *args, **kwargs)

Diffusion Kurtosis Tensor Model [R513]

Parameters:

gtab : GradientTable class instance

fit_method : str or callable

str can be one of the following: ‘OLS’ or ‘ULLS’ for ordinary least squares (dki.ols_fit_dki); ‘WLS’ or ‘UWLLS’ for weighted ordinary least squares (dki.wls_fit_dki).

A callable has to have the signature: fit_method(design_matrix, data, *args, **kwargs)

args, kwargs : arguments and keyword arguments passed to the fit_method

See dki.ols_fit_dki and dki.wls_fit_dki for details.

References

[R513]Tabesh, A., Jensen, J.H., Ardekani, B.A., Helpern, J.A., 2011. Estimation of tensors and tensor-derived measures in diffusional kurtosis imaging. Magn Reson Med. 65(3), 823-836

fit(data, mask=None)

Fit method of the DKI model class

Parameters:

data : array

The measured signal from one voxel.

mask : array

A boolean array of shape data.shape[:-1], used to mark the coordinates in the data that should be analyzed

predict(dki_params, S0=1.0)

Predict a signal for this DKI model class instance given parameters.

Parameters:

dki_params : ndarray (x, y, z, 27) or (n, 27)

All parameters estimated from the diffusion kurtosis model. Parameters are ordered as follows:

  1. Three diffusion tensor’s eigenvalues
  2. Three lines of the eigenvector matrix each containing the first, second and third coordinates of the eigenvector
  3. Fifteen elements of the kurtosis tensor

S0 : float or ndarray (optional)

The non diffusion-weighted signal in every voxel, or across all voxels. Default: 1
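The 3 + 9 + 15 = 27 parameter layout listed above can be illustrated by a small unpacking sketch; split_dki_params is a hypothetical helper, not dipy API:

```python
def split_dki_params(params):
    """Sketch: split one voxel's 27 DKI parameters into the three
    groups documented above."""
    assert len(params) == 27
    evals = params[:3]                       # 3 diffusion tensor eigenvalues
    evecs = [params[3 + 3 * i: 6 + 3 * i]    # 3 eigenvectors, one per row
             for i in range(3)]
    kt = params[12:]                         # 15 kurtosis tensor elements
    return evals, evecs, kt

evals, evecs, kt = split_dki_params(list(range(27)))
print(evals)     # [0, 1, 2]
print(evecs[0])  # [3, 4, 5]
print(len(kt))   # 15
```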

ReconstCSAFlow

class dipy.workflows.reconst.ReconstCSAFlow(output_strategy='append', mix_names=False, force=False, skip=False)

Bases: dipy.workflows.workflow.Workflow

Methods

get_io_iterator() Create an iterator for IO.
get_short_name()
get_sub_runs() No sub runs since this is a simple workflow.
manage_output_overwrite() Check if a file will be overwritten upon processing the inputs.
run(input_files, bvalues, bvectors, mask_files) Constant Solid Angle.
__init__(output_strategy='append', mix_names=False, force=False, skip=False)

The basic workflow object.

This object takes care of any workflow operation that is common to all the workflows. Every new workflow should extend this class.

classmethod get_short_name()
run(input_files, bvalues, bvectors, mask_files, sh_order=6, odf_to_sh_order=8, b0_threshold=0.0, bvecs_tol=0.01, extract_pam_values=False, out_dir='', out_pam='peaks.pam5', out_shm='shm.nii.gz', out_peaks_dir='peaks_dirs.nii.gz', out_peaks_values='peaks_values.nii.gz', out_peaks_indices='peaks_indices.nii.gz', out_gfa='gfa.nii.gz')

Constant Solid Angle.

Parameters:

input_files : string

Path to the input volumes. This path may contain wildcards to process multiple inputs at once.

bvalues : string

Path to the bvalues files. This path may contain wildcards to use multiple bvalues files at once.

bvectors : string

Path to the bvectors files. This path may contain wildcards to use multiple bvectors files at once.

mask_files : string

Path to the input masks. This path may contain wildcards to use multiple masks at once. (default: No mask used)

sh_order : int, optional

Spherical harmonics order (default 6) used in the CSA fit.

odf_to_sh_order : int, optional

Spherical harmonics order used for peak_from_model to compress the ODF to spherical harmonics coefficients (default 8)

b0_threshold : float, optional

Threshold used to find b=0 directions

bvecs_tol : float, optional

Threshold used so that norm(bvec)=1 (default 0.01)

extract_pam_values : bool, optional

Whether or not to save pam volumes as single nifti files.

out_dir : string, optional

Output directory (default input file directory)

out_pam : string, optional

Name of the peaks volume to be saved (default ‘peaks.pam5’)

out_shm : string, optional

Name of the spherical harmonics volume to be saved (default ‘shm.nii.gz’)

out_peaks_dir : string, optional

Name of the peaks directions volume to be saved (default ‘peaks_dirs.nii.gz’)

out_peaks_values : string, optional

Name of the peaks values volume to be saved (default ‘peaks_values.nii.gz’)

out_peaks_indices : string, optional

Name of the peaks indices volume to be saved (default ‘peaks_indices.nii.gz’)

out_gfa : string, optional

Name of the generalized FA volume to be saved (default ‘gfa.nii.gz’)

References

[R514]Aganj, I., et al. 2009. ODF Reconstruction in Q-Ball Imaging with Solid Angle Consideration.

ReconstCSDFlow

class dipy.workflows.reconst.ReconstCSDFlow(output_strategy='append', mix_names=False, force=False, skip=False)

Bases: dipy.workflows.workflow.Workflow

Methods

get_io_iterator() Create an iterator for IO.
get_short_name()
get_sub_runs() No sub runs since this is a simple workflow.
manage_output_overwrite() Check if a file will be overwritten upon processing the inputs.
run(input_files, bvalues, bvectors, mask_files) Constrained spherical deconvolution
__init__(output_strategy='append', mix_names=False, force=False, skip=False)

The basic workflow object.

This object takes care of any workflow operation that is common to all the workflows. Every new workflow should extend this class.

classmethod get_short_name()
run(input_files, bvalues, bvectors, mask_files, b0_threshold=0.0, bvecs_tol=0.01, roi_center=None, roi_radius=10, fa_thr=0.7, frf=None, extract_pam_values=False, sh_order=8, odf_to_sh_order=8, out_dir='', out_pam='peaks.pam5', out_shm='shm.nii.gz', out_peaks_dir='peaks_dirs.nii.gz', out_peaks_values='peaks_values.nii.gz', out_peaks_indices='peaks_indices.nii.gz', out_gfa='gfa.nii.gz')

Constrained spherical deconvolution

Parameters:

input_files : string

Path to the input volumes. This path may contain wildcards to process multiple inputs at once.

bvalues : string

Path to the bvalues files. This path may contain wildcards to use multiple bvalues files at once.

bvectors : string

Path to the bvectors files. This path may contain wildcards to use multiple bvectors files at once.

mask_files : string

Path to the input masks. This path may contain wildcards to use multiple masks at once. (default: No mask used)

b0_threshold : float, optional

Threshold used to find b=0 directions

bvecs_tol : float, optional

Threshold used to check that b-vectors are unit vectors (default 0.01)

roi_center : variable int, optional

Center of ROI in data. If center is None, it is assumed that it is the center of the volume with shape data.shape[:3] (default None)

roi_radius : int, optional

radius of cubic ROI in voxels (default 10)

fa_thr : float, optional

FA threshold for calculating the response function (default 0.7)

frf : variable float, optional

Fiber response function. It can for example be input as 15 4 4 (from the command line) or [15, 4, 4] (from a Python script); the values are converted to float and multiplied by 10**-4. If None, the fiber response function will be computed automatically (default: None).

extract_pam_values : bool, optional

Whether or not to save pam volumes as single nifti files.

sh_order : int, optional

Spherical harmonics order used in the CSD fit (default 8)

odf_to_sh_order : int, optional

Spherical harmonics order used for peak_from_model to compress the ODF to spherical harmonics coefficients (default 8)

out_dir : string, optional

Output directory (default input file directory)

out_pam : string, optional

Name of the peaks volume to be saved (default ‘peaks.pam5’)

out_shm : string, optional

Name of the spherical harmonics volume to be saved (default ‘shm.nii.gz’)

out_peaks_dir : string, optional

Name of the peaks directions volume to be saved (default ‘peaks_dirs.nii.gz’)

out_peaks_values : string, optional

Name of the peaks values volume to be saved (default ‘peaks_values.nii.gz’)

out_peaks_indices : string, optional

Name of the peaks indices volume to be saved (default ‘peaks_indices.nii.gz’)

out_gfa : string, optional

Name of the generalized FA volume to be saved (default ‘gfa.nii.gz’)

References

[R516]Tournier, J.D., et al. NeuroImage 2007. Robust determination of the fibre orientation distribution in diffusion MRI: Non-negativity constrained super-resolved spherical deconvolution.

ReconstDkiFlow

class dipy.workflows.reconst.ReconstDkiFlow(output_strategy='append', mix_names=False, force=False, skip=False)

Bases: dipy.workflows.workflow.Workflow

Methods

get_dki_model(gtab)
get_fitted_tensor(data, mask, bval, bvec[, ...])
get_io_iterator() Create an iterator for IO.
get_short_name()
get_sub_runs() No sub runs since this is a simple workflow.
manage_output_overwrite() Check if a file will be overwritten upon processing the inputs.
run(input_files, bvalues, bvectors, mask_files) Workflow for Diffusion Kurtosis reconstruction and for computing DKI metrics.
__init__(output_strategy='append', mix_names=False, force=False, skip=False)

The basic workflow object.

This object takes care of any workflow operation that is common to all the workflows. Every new workflow should extend this class.

get_dki_model(gtab)
get_fitted_tensor(data, mask, bval, bvec, b0_threshold=0)
classmethod get_short_name()
run(input_files, bvalues, bvectors, mask_files, b0_threshold=0.0, save_metrics=[], out_dir='', out_dt_tensor='dti_tensors.nii.gz', out_fa='fa.nii.gz', out_ga='ga.nii.gz', out_rgb='rgb.nii.gz', out_md='md.nii.gz', out_ad='ad.nii.gz', out_rd='rd.nii.gz', out_mode='mode.nii.gz', out_evec='evecs.nii.gz', out_eval='evals.nii.gz', out_dk_tensor='dki_tensors.nii.gz', out_mk='mk.nii.gz', out_ak='ak.nii.gz', out_rk='rk.nii.gz')

Workflow for Diffusion Kurtosis reconstruction and for computing DKI metrics. Performs a DKI reconstruction on the files by ‘globbing’ input_files and saves the DKI metrics in a directory specified by out_dir.

Parameters:

input_files : string

Path to the input volumes. This path may contain wildcards to process multiple inputs at once.

bvalues : string

Path to the bvalues files. This path may contain wildcards to use multiple bvalues files at once.

bvectors : string

Path to the bvectors files. This path may contain wildcards to use multiple bvectors files at once.

mask_files : string

Path to the input masks. This path may contain wildcards to use multiple masks at once. (default: No mask used)

b0_threshold : float, optional

Threshold used to find b=0 directions (default 0.0)

save_metrics : variable string, optional

List of metrics to save. Possible values: fa, ga, rgb, md, ad, rd, mode, tensor, evec, eval (default [] (all))

out_dir : string, optional

Output directory (default input file directory)

out_dt_tensor : string, optional

Name of the tensors volume to be saved (default: ‘dti_tensors.nii.gz’)

out_dk_tensor : string, optional

Name of the tensors volume to be saved (default ‘dki_tensors.nii.gz’)

out_fa : string, optional

Name of the fractional anisotropy volume to be saved (default ‘fa.nii.gz’)

out_ga : string, optional

Name of the geodesic anisotropy volume to be saved (default ‘ga.nii.gz’)

out_rgb : string, optional

Name of the color fa volume to be saved (default ‘rgb.nii.gz’)

out_md : string, optional

Name of the mean diffusivity volume to be saved (default ‘md.nii.gz’)

out_ad : string, optional

Name of the axial diffusivity volume to be saved (default ‘ad.nii.gz’)

out_rd : string, optional

Name of the radial diffusivity volume to be saved (default ‘rd.nii.gz’)

out_mode : string, optional

Name of the mode volume to be saved (default ‘mode.nii.gz’)

out_evec : string, optional

Name of the eigenvectors volume to be saved (default ‘evecs.nii.gz’)

out_eval : string, optional

Name of the eigenvalues to be saved (default ‘evals.nii.gz’)

out_mk : string, optional

Name of the mean kurtosis to be saved (default: ‘mk.nii.gz’)

out_ak : string, optional

Name of the axial kurtosis to be saved (default: ‘ak.nii.gz’)

out_rk : string, optional

Name of the radial kurtosis to be saved (default: ‘rk.nii.gz’)

References

[R518]Tabesh, A., Jensen, J.H., Ardekani, B.A., Helpern, J.A., 2011. Estimation of tensors and tensor-derived measures in diffusional kurtosis imaging. Magn Reson Med. 65(3), 823-836
[R519]Jensen, Jens H., Joseph A. Helpern, Anita Ramani, Hanzhang Lu, and Kyle Kaczynski. 2005. Diffusional Kurtosis Imaging: The Quantification of Non-Gaussian Water Diffusion by Means of Magnetic Resonance Imaging. MRM 53 (6):1432-40.

ReconstDtiFlow

class dipy.workflows.reconst.ReconstDtiFlow(output_strategy='append', mix_names=False, force=False, skip=False)

Bases: dipy.workflows.workflow.Workflow

Methods

get_fitted_tensor(data, mask, bval, bvec[, ...])
get_io_iterator() Create an iterator for IO.
get_short_name()
get_sub_runs() No sub runs since this is a simple workflow.
get_tensor_model(gtab)
manage_output_overwrite() Check if a file will be overwritten upon processing the inputs.
run(input_files, bvalues, bvectors, mask_files) Workflow for tensor reconstruction and for computing DTI metrics.
__init__(output_strategy='append', mix_names=False, force=False, skip=False)

The basic workflow object.

This object takes care of any workflow operation that is common to all the workflows. Every new workflow should extend this class.

get_fitted_tensor(data, mask, bval, bvec, b0_threshold=0, bvecs_tol=0.01)
classmethod get_short_name()
get_tensor_model(gtab)
run(input_files, bvalues, bvectors, mask_files, b0_threshold=0.0, bvecs_tol=0.01, save_metrics=[], out_dir='', out_tensor='tensors.nii.gz', out_fa='fa.nii.gz', out_ga='ga.nii.gz', out_rgb='rgb.nii.gz', out_md='md.nii.gz', out_ad='ad.nii.gz', out_rd='rd.nii.gz', out_mode='mode.nii.gz', out_evec='evecs.nii.gz', out_eval='evals.nii.gz')

Workflow for tensor reconstruction and for computing DTI metrics using Weighted Least-Squares. Performs a tensor reconstruction on the files by ‘globbing’ input_files and saves the DTI metrics in a directory specified by out_dir.

Parameters:

input_files : string

Path to the input volumes. This path may contain wildcards to process multiple inputs at once.

bvalues : string

Path to the bvalues files. This path may contain wildcards to use multiple bvalues files at once.

bvectors : string

Path to the bvectors files. This path may contain wildcards to use multiple bvectors files at once.

mask_files : string

Path to the input masks. This path may contain wildcards to use multiple masks at once. (default: No mask used)

b0_threshold : float, optional

Threshold used to find b=0 directions (default 0.0)

bvecs_tol : float, optional

Threshold used to check that norm(bvec) = 1 +/- bvecs_tol, i.e. that b-vectors are unit vectors (default 0.01)

save_metrics : variable string, optional

List of metrics to save. Possible values: fa, ga, rgb, md, ad, rd, mode, tensor, evec, eval (default [] (all))

out_dir : string, optional

Output directory (default input file directory)

out_tensor : string, optional

Name of the tensors volume to be saved (default ‘tensors.nii.gz’)

out_fa : string, optional

Name of the fractional anisotropy volume to be saved (default ‘fa.nii.gz’)

out_ga : string, optional

Name of the geodesic anisotropy volume to be saved (default ‘ga.nii.gz’)

out_rgb : string, optional

Name of the color fa volume to be saved (default ‘rgb.nii.gz’)

out_md : string, optional

Name of the mean diffusivity volume to be saved (default ‘md.nii.gz’)

out_ad : string, optional

Name of the axial diffusivity volume to be saved (default ‘ad.nii.gz’)

out_rd : string, optional

Name of the radial diffusivity volume to be saved (default ‘rd.nii.gz’)

out_mode : string, optional

Name of the mode volume to be saved (default ‘mode.nii.gz’)

out_evec : string, optional

Name of the eigenvectors volume to be saved (default ‘evecs.nii.gz’)

out_eval : string, optional

Name of the eigenvalues to be saved (default ‘evals.nii.gz’)

References

[R522]Basser, P.J., Mattiello, J., LeBihan, D., 1994. Estimation of the effective self-diffusion tensor from the NMR spin echo. J Magn Reson B 103, 247-254.
[R523]Basser, P., Pierpaoli, C., 1996. Microstructural and physiological features of tissues elucidated by quantitative diffusion-tensor MRI. Journal of Magnetic Resonance 111, 209-219.
[R524]Lin-Ching C., Jones D.K., Pierpaoli, C. 2005. RESTORE: Robust estimation of tensors by outlier rejection. MRM 53: 1088-1095
[R525]Chung, S.W., Lu, Y., Henry, R.G., 2006. Comparison of bootstrap approaches for estimation of uncertainties of DTI parameters. NeuroImage 33, 531-541.

ReconstMAPMRIFlow

class dipy.workflows.reconst.ReconstMAPMRIFlow(output_strategy='append', mix_names=False, force=False, skip=False)

Bases: dipy.workflows.workflow.Workflow

Methods

get_io_iterator() Create an iterator for IO.
get_short_name()
get_sub_runs() No sub runs since this is a simple workflow.
manage_output_overwrite() Check if a file will be overwritten upon processing the inputs.
run(data_file, data_bvals, data_bvecs, ...) Workflow for fitting the MAPMRI model (with optional Laplacian regularization).
__init__(output_strategy='append', mix_names=False, force=False, skip=False)

The basic workflow object.

This object takes care of any workflow operation that is common to all the workflows. Every new workflow should extend this class.

classmethod get_short_name()
run(data_file, data_bvals, data_bvecs, small_delta, big_delta, b0_threshold=0.0, laplacian=True, positivity=True, bval_threshold=2000, save_metrics=[], laplacian_weighting=0.05, radial_order=6, out_dir='', out_rtop='rtop.nii.gz', out_lapnorm='lapnorm.nii.gz', out_msd='msd.nii.gz', out_qiv='qiv.nii.gz', out_rtap='rtap.nii.gz', out_rtpp='rtpp.nii.gz', out_ng='ng.nii.gz', out_perng='perng.nii.gz', out_parng='parng.nii.gz')

Workflow for fitting the MAPMRI model (with optional Laplacian regularization). Generates rtop, lapnorm, msd, qiv, rtap, rtpp, non-Gaussianity (ng), parallel ng and perpendicular ng maps from the input files provided by data_file, and saves them in NIfTI format to the output directory specified by out_dir.

For the MAPMRI workflow to work as intended, either laplacian or positivity (or both) must be set to True.

Parameters:

data_file : string

Path to the input volume.

data_bvals : string

Path to the bval files.

data_bvecs : string

Path to the bvec files.

small_delta : float

Small delta value used in generation of gradient table of provided bval and bvec.

big_delta : float

Big delta value used in generation of gradient table of provided bval and bvec.

b0_threshold : float, optional

Threshold used to find b=0 directions (default 0.0)

laplacian : bool

Regularize using the Laplacian of the MAP-MRI basis (default True)

positivity : bool

Constrain the propagator to be positive. (default True)

bval_threshold : float

Sets the b-value threshold to be used in the scale factor estimation. In order for the estimated non-Gaussianity to have meaning, this value should be set to a lower value (b<2000 s/mm^2) such that the scale factors are estimated on signal points that reasonably represent the spins at Gaussian diffusion. (default: 2000)

save_metrics : list of strings

List of metrics to save. Possible values: rtop, laplacian_signal, msd, qiv, rtap, rtpp, ng, perng, parng (default: [] (all))

laplacian_weighting : float

Weighting value used in fitting the MAPMRI model in the laplacian and both model types. (default: 0.05)

radial_order : unsigned int

Even value used to set the order of the basis (default: 6)

out_dir : string, optional

Output directory (default: input file directory)

out_rtop : string, optional

Name of the rtop to be saved

out_lapnorm : string, optional

Name of the norm of laplacian signal to be saved

out_msd : string, optional

Name of the msd to be saved

out_qiv : string, optional

Name of the qiv to be saved

out_rtap : string, optional

Name of the rtap to be saved

out_rtpp : string, optional

Name of the rtpp to be saved

out_ng : string, optional

Name of the Non-Gaussianity to be saved

out_perng : string, optional

Name of the Non-Gaussianity perpendicular to be saved

out_parng : string, optional

Name of the Non-Gaussianity parallel to be saved

TensorModel

class dipy.workflows.reconst.TensorModel(gtab, fit_method='WLS', return_S0_hat=False, *args, **kwargs)

Bases: dipy.reconst.base.ReconstModel

Diffusion Tensor

Methods

fit(data[, mask]) Fit method of the DTI model class
predict(dti_params[, S0]) Predict a signal for this TensorModel class instance given parameters.
__init__(gtab, fit_method='WLS', return_S0_hat=False, *args, **kwargs)

A Diffusion Tensor Model [R530], [R531].

Parameters:

gtab : GradientTable class instance

fit_method : str or callable

str can be one of the following:

‘WLS’ for weighted least squares

dti.wls_fit_tensor()

‘LS’ or ‘OLS’ for ordinary least squares

dti.ols_fit_tensor()

‘NLLS’ for non-linear least-squares

dti.nlls_fit_tensor()

‘RT’ or ‘restore’ or ‘RESTORE’ for RESTORE robust tensor

fitting [R532] dti.restore_fit_tensor()

callable has to have the signature:

fit_method(design_matrix, data, *args, **kwargs)

return_S0_hat : bool

Boolean to return (True) or not (False) the S0 values for the fit.

args, kwargs : arguments and key-word arguments passed to the

fit_method. See dti.wls_fit_tensor, dti.ols_fit_tensor for details

min_signal : float

The minimum signal value. Needs to be a strictly positive number. Default: minimal signal in the data provided to fit.

Notes

In order to increase speed of processing, tensor fitting is done simultaneously over many voxels. Many fit_methods use the ‘step’ parameter to set the number of voxels that will be fit at once in each iteration. This is the chunk size as a number of voxels. A larger step value should speed things up, but it will also take up more memory. It is advisable to keep an eye on memory consumption as this value is increased.

Example: in iter_fit_tensor() we have a default step value of 1e4.
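As a rough illustration of what the ‘LS’/‘OLS’ option does, the log signal can be regressed against a B-matrix design. This is a hedged sketch with illustrative names (design_matrix, ols_fit_tensor_sketch), not dipy’s dti.ols_fit_tensor:

```python
import numpy as np

def design_matrix(bvals, bvecs):
    # One row per measurement: -b * [gx^2, 2 gx gy, gy^2, 2 gx gz, 2 gy gz, gz^2],
    # plus a final column of ones for log(S0) (lower-triangular ordering).
    g = np.asarray(bvecs, dtype=float)
    quad = np.column_stack([
        g[:, 0] ** 2, 2 * g[:, 0] * g[:, 1], g[:, 1] ** 2,
        2 * g[:, 0] * g[:, 2], 2 * g[:, 1] * g[:, 2], g[:, 2] ** 2,
    ])
    return np.hstack([-np.asarray(bvals, dtype=float)[:, None] * quad,
                      np.ones((len(bvals), 1))])

def ols_fit_tensor_sketch(X, signal):
    # Solve X @ p = log(S) in the least-squares sense; p[:6] are the six
    # unique tensor elements, p[6] is log(S0).
    p, *_ = np.linalg.lstsq(X, np.log(signal), rcond=None)
    return p

# Synthetic check: an isotropic tensor with diffusivity 1e-3 and S0 = 100.
rng = np.random.default_rng(0)
dirs = rng.normal(size=(6, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
bvals = np.array([0.0] + [1000.0] * 6)
bvecs = np.vstack([[0.0, 0.0, 0.0], dirs])
X = design_matrix(bvals, bvecs)
true_p = np.array([1e-3, 0.0, 1e-3, 0.0, 0.0, 1e-3, np.log(100.0)])
params = ols_fit_tensor_sketch(X, np.exp(X @ true_p))
```

The ‘WLS’ variant additionally weights each row by the (squared) predicted signal; see dti.wls_fit_tensor for the real implementation.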

References

[R530] Basser, P.J., Mattiello, J., LeBihan, D., 1994. Estimation of the effective self-diffusion tensor from the NMR spin echo. J Magn Reson B 103, 247-254.
[R531] Basser, P., Pierpaoli, C., 1996. Microstructural and physiological features of tissues elucidated by quantitative diffusion-tensor MRI. Journal of Magnetic Resonance 111, 209-219.
[R532] Lin-Ching C., Jones D.K., Pierpaoli, C. 2005. RESTORE: Robust estimation of tensors by outlier rejection. MRM 53: 1088-1095
fit(data, mask=None)

Fit method of the DTI model class

Parameters:

data : array

The measured signal from one voxel.

mask : array

A boolean array used to mark the coordinates in the data that should be analyzed that has the shape data.shape[:-1]

predict(dti_params, S0=1.0)

Predict a signal for this TensorModel class instance given parameters.

Parameters:

dti_params : ndarray

The last dimension should have 12 tensor parameters: 3 eigenvalues, followed by the 3 eigenvectors (9 components)

S0 : float or ndarray

The non diffusion-weighted signal in every voxel, or across all voxels. Default: 1

Workflow

class dipy.workflows.reconst.Workflow(output_strategy='append', mix_names=False, force=False, skip=False)

Bases: object

Methods

get_io_iterator() Create an iterator for IO.
get_short_name() A short name for the workflow used to subdivide
get_sub_runs() No sub runs since this is a simple workflow.
manage_output_overwrite() Check if a file will be overwritten upon processing the inputs.
run() Since this is an abstract class, raise exception if this code is
__init__(output_strategy='append', mix_names=False, force=False, skip=False)

The basic workflow object.

This object takes care of any workflow operation that is common to all the workflows. Every new workflow should extend this class.

get_io_iterator()

Create an iterator for IO.

Use a couple of inspection tricks to build an IOIterator using the previous frame (values of local variables and other contextuals) and the run method’s docstring.

classmethod get_short_name()

A short name for the workflow used to subdivide

The short name is used by CombinedWorkflows and the argparser to subdivide the commandline parameters avoiding the trouble of having subworkflows parameters with the same name.

For example, a combined workflow with dti reconstruction and csd reconstruction might both end up with a b0_threshold parameter. Using short names, we will have dti.b0_threshold and csd.b0_threshold available.

Returns class name by default but it is strongly advised to set it to something shorter and easier to write on commandline.

get_sub_runs()

No sub runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

If it is bound to happen, an action is taken depending on self._force_overwrite (or --force via command line). A log message is output independently of the outcome to tell the user something happened.

run()

Since this is an abstract class, raise an exception if this code is reached (not implemented in the child class, or literally called on this class)

auto_response

dipy.workflows.reconst.auto_response(gtab, data, roi_center=None, roi_radius=10, fa_thr=0.7, fa_callable=<function fa_superior>, return_number_of_voxels=False)

Automatic estimation of response function using FA.

Parameters:

gtab : GradientTable

data : ndarray

diffusion data

roi_center : tuple, (3,)

Center of ROI in data. If center is None, it is assumed that it is the center of the volume with shape data.shape[:3].

roi_radius : int

radius of cubic ROI

fa_thr : float

FA threshold

fa_callable : callable

A callable that defines an operation that compares FA with the fa_thr. The operator should have two positional arguments (e.g., fa_operator(FA, fa_thr)) and it should return a bool array.

return_number_of_voxels : bool

If True, returns the number of voxels used for estimating the response function.

Returns:

response : tuple, (2,)

(evals, S0)

ratio : float

The ratio between smallest versus largest eigenvalue of the response.

number of voxels : int (optional)

The number of voxels used for estimating the response function.

Notes

In CSD there is an important pre-processing step: the estimation of the fiber response function. In order to do this we look for voxels with very anisotropic configurations. For example, we can use an ROI (20x20x20) at the center of the volume and store the signal values for the voxels with FA values higher than 0.7. Of course, if we haven’t precalculated FA we need to fit a Tensor model to the datasets, which is what we do in this function.

For the response we also need to find the average S0 in the ROI. Using gtab.b0s_mask() we can find all the S0 volumes (which correspond to b-values equal to 0) in the dataset.

The response consists always of a prolate tensor created by averaging the highest and second highest eigenvalues in the ROI with FA higher than threshold. We also include the average S0s.

We also return the ratio which is used for the SDT models. If requested, the number of voxels used for estimating the response function is also returned, which can be used to judge the fidelity of the response function. As a rule of thumb, at least 300 voxels should be used to estimate a good response function (see [R533]).
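The voxel-selection step described in the Notes can be sketched in plain NumPy. Here, select_response_voxels is a hypothetical helper (not dipy’s implementation): it crops a cubic ROI around the volume center and keeps voxels whose FA exceeds the threshold.

```python
import numpy as np

def select_response_voxels(FA, roi_center=None, roi_radius=10, fa_thr=0.7):
    # Boolean mask (full volume) of voxels inside a cubic ROI of half-width
    # roi_radius around roi_center whose FA exceeds fa_thr.
    if roi_center is None:
        roi_center = np.array(FA.shape) // 2
    ci, cj, ck = roi_center
    r = roi_radius
    mask = np.zeros(FA.shape, dtype=bool)
    sl = (slice(ci - r, ci + r), slice(cj - r, cj + r), slice(ck - r, ck + r))
    mask[sl] = FA[sl] > fa_thr
    return mask

# Demo: a 40^3 FA volume with one high-FA voxel inside the central ROI.
FA = np.zeros((40, 40, 40))
FA[20, 20, 20] = 0.9
FA[0, 0, 0] = 0.95  # high FA, but outside the central 20x20x20 ROI
mask = select_response_voxels(FA)
```

Counting `mask.sum()` gives the number of voxels used for the response, which is what return_number_of_voxels exposes.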

References

[R533] Tournier, J.D., et al. NeuroImage 2004. Direct estimation of the fiber orientation density function from diffusion-weighted MRI data using spherical deconvolution

axial_diffusivity

dipy.workflows.reconst.axial_diffusivity(evals, axis=-1)

Axial Diffusivity (AD) of a diffusion tensor. Also called parallel diffusivity.

Parameters:

evals : array-like

Eigenvalues of a diffusion tensor, must be sorted in descending order along axis.

axis : int

Axis of evals which contains 3 eigenvalues.

Returns:

ad : array

Calculated AD.

Notes

AD is calculated with the following equation:

\[AD = \lambda_1\]
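Since the eigenvalues are sorted in descending order along axis, AD is just the first eigenvalue; a one-line NumPy illustration:

```python
import numpy as np

# Eigenvalues sorted in descending order along the last axis,
# as the docstring requires.
evals = np.array([[1.7e-3, 0.3e-3, 0.2e-3],
                  [1.0e-3, 1.0e-3, 1.0e-3]])

ad = np.take(evals, 0, axis=-1)  # AD = lambda_1
```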

color_fa

dipy.workflows.reconst.color_fa(fa, evecs)

Color fractional anisotropy of diffusion tensor

Parameters:

fa : array-like

Array of the fractional anisotropy (can be 1D, 2D or 3D)

evecs : array-like

eigen vectors from the tensor model

Returns:

rgb : Array with 3 channels for each color as the last dimension.

Colormap of the FA with red for the x value, y for the green value and z for the blue value.

\[rgb = abs(max(\vec{e})) \times fa\]

fractional_anisotropy

dipy.workflows.reconst.fractional_anisotropy(evals, axis=-1)
Fractional anisotropy (FA) of a diffusion tensor.
Parameters:

evals : array-like

Eigenvalues of a diffusion tensor.

axis : int

Axis of evals which contains 3 eigenvalues.

Returns:

fa : array

Calculated FA. Range is 0 <= FA <= 1.

\[FA = \sqrt{\frac{1}{2}\frac{(\lambda_1-\lambda_2)^2+(\lambda_1-\lambda_3)^2+(\lambda_2-\lambda_3)^2}{\lambda_1^2+\lambda_2^2+\lambda_3^2}}\]
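FA can be computed directly from the eigenvalues. A NumPy sketch of the definition (fa_from_evals is an illustrative name; dipy’s implementation also handles degenerate all-zero voxels):

```python
import numpy as np

def fa_from_evals(evals):
    # Direct transcription of the FA definition; eigenvalues on the last axis.
    l1, l2, l3 = np.moveaxis(np.asarray(evals, dtype=float), -1, 0)
    num = (l1 - l2) ** 2 + (l1 - l3) ** 2 + (l2 - l3) ** 2
    den = l1 ** 2 + l2 ** 2 + l3 ** 2
    return np.sqrt(0.5 * num / den)
```

An isotropic tensor gives FA = 0, while a tensor with a single non-zero eigenvalue gives FA = 1.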

geodesic_anisotropy

dipy.workflows.reconst.geodesic_anisotropy(evals, axis=-1)
Geodesic anisotropy (GA) of a diffusion tensor.
Parameters:

evals : array-like

Eigenvalues of a diffusion tensor.

axis : int

Axis of evals which contains 3 eigenvalues.

Returns:

ga : array

Calculated GA. In the range 0 to +infinity

\[GA = \sqrt{\sum_{i=1}^{3} \log^2{\left(\frac{\lambda_i}{<\mathbf{D}>}\right)}}, \quad \textrm{where} \quad <\mathbf{D}> = (\lambda_1\lambda_2\lambda_3)^{1/3}\]

Note that the notation, \(<D>\), is often used as the mean diffusivity (MD) of the diffusion tensor and can lead to confusion in the literature (see [R534] versus [R535] versus [R536] for example). Reference [R535] defines geodesic anisotropy (GA) with \(<D>\) as the MD in the denominator of the sum. This is wrong. The original paper [R534] defines GA with \(<D> = det(D)^{1/3}\), as the isotropic part of the distance. This might be an explanation for the confusion. The isotropic part of the diffusion tensor in Euclidean space is the MD, whereas the isotropic part of the tensor in log-Euclidean space is \(det(D)^{1/3}\). The Appendix of [R534] and log-Euclidean derivations from [R536] are clear on this. Hence, \(<D> = det(D)^{1/3}\) here for the GA definition, and not MD.
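A NumPy sketch of GA under this \(<D> = det(D)^{1/3}\) convention (ga_from_evals is an illustrative name, not dipy’s API):

```python
import numpy as np

def ga_from_evals(evals):
    # log(lambda_i / <D>) with <D> = (l1*l2*l3)^(1/3) = det(D)^(1/3),
    # so log<D> is the mean of the log eigenvalues.
    logl = np.log(np.asarray(evals, dtype=float))
    return np.sqrt(np.sum((logl - logl.mean(axis=-1, keepdims=True)) ** 2,
                          axis=-1))
```

An isotropic tensor gives GA = 0; anisotropy in log-eigenvalue space increases GA without bound.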

get_mode

dipy.workflows.reconst.get_mode(q_form)

Mode (MO) of a diffusion tensor [R538].

Parameters:

q_form : ndarray

The quadratic form of a tensor, or an array with quadratic forms of tensors. Should be of shape (x, y, z, 3, 3) or (n, 3, 3) or (3, 3).

Returns:

mode : array

Calculated tensor mode in each spatial coordinate.

Notes

Mode ranges between -1 (planar anisotropy) and +1 (linear anisotropy) with 0 representing orthotropy. Mode is calculated with the following equation (equation 9 in [R538]):

\[Mode = 3*\sqrt{6}*det(\widetilde{A}/norm(\widetilde{A}))\]

Where \(\widetilde{A}\) is the deviatoric part of the tensor quadratic form.
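A NumPy sketch of equation 9 for a single 3x3 quadratic form (tensor_mode is an illustrative name): a purely linear tensor gives mode +1 and a purely planar one gives -1.

```python
import numpy as np

def tensor_mode(q_form):
    # Deviatoric part A~ = D - (tr D / 3) I, then equation (9):
    # mode = 3 * sqrt(6) * det(A~ / ||A~||_F)
    D = np.asarray(q_form, dtype=float)
    A = D - np.trace(D) / 3.0 * np.eye(3)
    return 3.0 * np.sqrt(6.0) * np.linalg.det(A / np.linalg.norm(A))
```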

References

[R538] Daniel B. Ennis and G. Kindlmann, “Orthogonal Tensor Invariants and the Analysis of Diffusion Tensor Magnetic Resonance Images”, Magnetic Resonance in Medicine, vol. 55, no. 1, pp. 136-146, 2006.

get_sphere

dipy.workflows.reconst.get_sphere(name='symmetric362')

Provide triangulated spheres

Parameters:

name : str

which sphere - one of: * ‘symmetric362’ * ‘symmetric642’ * ‘symmetric724’ * ‘repulsion724’ * ‘repulsion100’ * ‘repulsion200’

Returns:

sphere : a dipy.core.sphere.Sphere class instance

Examples

>>> import numpy as np
>>> from dipy.data import get_sphere
>>> sphere = get_sphere('symmetric362')
>>> verts, faces = sphere.vertices, sphere.faces
>>> verts.shape == (362, 3)
True
>>> faces.shape == (720, 3)
True
>>> verts, faces = get_sphere('not a sphere name') 
Traceback (most recent call last):
    ...
DataError: No sphere called "not a sphere name"

gradient_table

dipy.workflows.reconst.gradient_table(bvals, bvecs=None, big_delta=None, small_delta=None, b0_threshold=0, atol=0.01)

A general function for creating diffusion MR gradients.

It reads, loads and prepares scanner parameters like the b-values and b-vectors so that they can be useful during the reconstruction process.

Parameters:

bvals : can be any of the four options

  1. an array of shape (N,) or (1, N) or (N, 1) with the b-values.
  2. a path for the file which contains an array like the above (1).
  3. an array of shape (N, 4) or (4, N). Then this parameter is considered to be a b-table which contains both bvals and bvecs. In this case the next parameter is skipped.
  4. a path for the file which contains an array like the one at (3).

bvecs : can be any of two options

  1. an array of shape (N, 3) or (3, N) with the b-vectors.
  2. a path for the file which contains an array like the previous.

big_delta : float

acquisition timing duration (default None)

small_delta : float

acquisition timing duration (default None)

b0_threshold : float

All b-values with values less than or equal to b0_threshold are considered b0s, i.e. without diffusion weighting.

atol : float

All b-vectors need to be unit vectors up to a tolerance.

Returns:

gradients : GradientTable

A GradientTable with all the gradient information.

Notes

  1. Often b0s (b-values which correspond to images without diffusion weighting) have 0 values; however, in some cases the scanner cannot provide b0s of an exact 0 value and gives slightly higher values, e.g. 6 or 12. This is the purpose of the b0_threshold in the __init__.
  2. We assume that the minimum number of b-values is 7.
  3. B-vectors should be unit vectors.

Examples

>>> import numpy as np
>>> from dipy.core.gradients import gradient_table
>>> bvals=1500*np.ones(7)
>>> bvals[0]=0
>>> sq2=np.sqrt(2)/2
>>> bvecs=np.array([[0, 0, 0],
...                 [1, 0, 0],
...                 [0, 1, 0],
...                 [0, 0, 1],
...                 [sq2, sq2, 0],
...                 [sq2, 0, sq2],
...                 [0, sq2, sq2]])
>>> gt = gradient_table(bvals, bvecs)
>>> gt.bvecs.shape == bvecs.shape
True
>>> gt = gradient_table(bvals, bvecs.T)
>>> gt.bvecs.shape == bvecs.T.shape
False

literal_eval

dipy.workflows.reconst.literal_eval(node_or_string)

Safely evaluate an expression node or a string containing a Python expression. The string or node provided may only consist of the following Python literal structures: strings, bytes, numbers, tuples, lists, dicts, sets, booleans, and None.
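This docstring matches the standard library’s ast.literal_eval, which in the workflows is handy for turning command-line strings into Python values:

```python
from ast import literal_eval

values = literal_eval('[3, 4, 5, 6]')  # a real list of ints, not a string
pair = literal_eval('(0.7, True)')     # tuples, floats and booleans work too
```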

lower_triangular

dipy.workflows.reconst.lower_triangular(tensor, b0=None)

Returns the six lower triangular values of the tensor and a dummy variable if b0 is not None

Parameters:

tensor : array_like (..., 3, 3)

a collection of 3, 3 diffusion tensors

b0 : float

If b0 is not None, log(b0) is returned as the dummy variable

Returns:

D : ndarray

If b0 is None, then the shape will be (..., 6), otherwise (..., 7)
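A NumPy sketch of the extraction, assuming the usual Dxx, Dxy, Dyy, Dxz, Dyz, Dzz ordering of the six values (an illustrative helper, without the b0 handling):

```python
import numpy as np

def lower_triangular_sketch(tensor):
    # Lower-triangular entries taken row by row:
    # (0,0), (1,0), (1,1), (2,0), (2,1), (2,2) -> Dxx, Dxy, Dyy, Dxz, Dyz, Dzz
    t = np.asarray(tensor)
    return t[..., [0, 1, 1, 2, 2, 2], [0, 0, 1, 0, 1, 2]]

D = np.array([[1.0, 2.0, 4.0],
              [2.0, 3.0, 5.0],
              [4.0, 5.0, 6.0]])
six = lower_triangular_sketch(D)
```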

mean_diffusivity

dipy.workflows.reconst.mean_diffusivity(evals, axis=-1)
Mean Diffusivity (MD) of a diffusion tensor.
Parameters:

evals : array-like

Eigenvalues of a diffusion tensor.

axis : int

Axis of evals which contains 3 eigenvalues.

Returns:

md : array

Calculated MD.

\[MD = \frac{\lambda_1 + \lambda_2 + \lambda_3}{3}\]
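MD is simply the average of the three eigenvalues along axis; a one-line NumPy illustration:

```python
import numpy as np

evals = np.array([1.7e-3, 0.3e-3, 0.1e-3])
md = evals.mean(axis=-1)  # MD = (l1 + l2 + l3) / 3
```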

peaks_from_model

dipy.workflows.reconst.peaks_from_model(model, data, sphere, relative_peak_threshold, min_separation_angle, mask=None, return_odf=False, return_sh=True, gfa_thr=0, normalize_peaks=False, sh_order=8, sh_basis_type=None, npeaks=5, B=None, invB=None, parallel=False, nbr_processes=None)

Fit the model to data and compute peaks and metrics

Parameters:

model : a model instance

model will be used to fit the data.

sphere : Sphere

The Sphere providing discrete directions for evaluation.

relative_peak_threshold : float

Only return peaks greater than relative_peak_threshold * m where m is the largest peak.

min_separation_angle : float in [0, 90]

The minimum angular distance between directions. If two peaks are too close only the larger of the two is returned.

mask : array, optional

If mask is provided, voxels that are False in mask are skipped and no peaks are returned.

return_odf : bool

If True, the odfs are returned.

return_sh : bool

If True, the odf as spherical harmonics coefficients is returned

gfa_thr : float

Voxels with gfa less than gfa_thr are skipped, no peaks are returned.

normalize_peaks : bool

If true, all peak values are calculated relative to max(odf).

sh_order : int, optional

Maximum SH order in the SH fit. For sh_order, there will be (sh_order + 1) * (sh_order + 2) / 2 SH coefficients (default 8).

sh_basis_type : {None, ‘mrtrix’, ‘fibernav’}

None for the default dipy basis which is the fibernav basis, mrtrix for the MRtrix basis, and fibernav for the FiberNavigator basis

sh_smooth : float, optional

Lambda-regularization in the SH fit (default 0.0).

npeaks : int

Maximum number of peaks found (default 5 peaks).

B : ndarray, optional

Matrix that transforms spherical harmonics to spherical function sf = np.dot(sh, B).

invB : ndarray, optional

Inverse of B.

parallel: bool

If True, use multiprocessing to compute peaks and metric (default False). Temporary files are saved in the default temporary directory of the system. It can be changed using import tempfile and tempfile.tempdir = '/path/to/tempdir'.

nbr_processes: int

If parallel is True, the number of subprocesses to use (default multiprocessing.cpu_count()).

Returns:

pam : PeaksAndMetrics

An object with gfa, peak_directions, peak_values, peak_indices, odf, shm_coeffs as attributes

peaks_to_niftis

dipy.workflows.reconst.peaks_to_niftis(pam, fname_shm, fname_dirs, fname_values, fname_indices, fname_gfa, reshape_dirs=False)

Save SH, directions, indices and values of peaks to Nifti.

radial_diffusivity

dipy.workflows.reconst.radial_diffusivity(evals, axis=-1)
Radial Diffusivity (RD) of a diffusion tensor. Also called perpendicular diffusivity.
Parameters:

evals : array-like

Eigenvalues of a diffusion tensor, must be sorted in descending order along axis.

axis : int

Axis of evals which contains 3 eigenvalues.

Returns:

rd : array

Calculated RD.

\[RD = \frac{\lambda_2 + \lambda_3}{2}\]
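With the eigenvalues sorted in descending order, RD is the mean of the two smallest; a short NumPy illustration:

```python
import numpy as np

evals = np.array([1.7e-3, 0.5e-3, 0.3e-3])  # sorted in descending order
rd = evals[..., 1:].mean(axis=-1)  # RD = (l2 + l3) / 2
```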

read_bvals_bvecs

dipy.workflows.reconst.read_bvals_bvecs(fbvals, fbvecs)

Read b-values and b-vectors from disk

Parameters:

fbvals : str

Full path to file with b-values. None to not read bvals.

fbvecs : str

Full path of file with b-vectors. None to not read bvecs.

Returns:

bvals : array, (N,) or None

bvecs : array, (N, 3) or None

Notes

Files can be either ‘.bvals’/’.bvecs’ or ‘.txt’ or ‘.npy’ (containing arrays stored with the appropriate values).

save_peaks

dipy.workflows.reconst.save_peaks(fname, pam, affine=None, verbose=False)

Save all important attributes of object PeaksAndMetrics in a PAM5 file (HDF5).

Parameters:

fname : string

Filename of PAM5 file

pam : PeaksAndMetrics

Object holding peak_dirs, shm_coeffs and other attributes

affine : array

The 4x4 matrix transforming the data from native to world coordinates. PeaksAndMetrics should have that attribute, but if not, it can be provided here. Default None.

verbose : bool

Print summary information about the saved file.

split_dki_param

dipy.workflows.reconst.split_dki_param(dki_params)

Extract the diffusion tensor eigenvalues, the diffusion tensor eigenvector matrix, and the 15 independent elements of the kurtosis tensor from the model parameters estimated from the DKI model

Parameters:

dki_params : ndarray (x, y, z, 27) or (n, 27)

All parameters estimated from the diffusion kurtosis model. Parameters are ordered as follows:

  1. Three diffusion tensor’s eigenvalues
  2. Three lines of the eigenvector matrix each containing the first, second and third coordinates of the eigenvector
  3. Fifteen elements of the kurtosis tensor
Returns:

eigvals : array (x, y, z, 3) or (n, 3)

Eigenvalues from eigen decomposition of the tensor.

eigvecs : array (x, y, z, 3, 3) or (n, 3, 3)

Associated eigenvectors from eigen decomposition of the tensor. Eigenvectors are columnar (e.g. eigvecs[:,j] is associated with eigvals[j])

kt : array (x, y, z, 15) or (n, 15)

Fifteen elements of the kurtosis tensor
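The split is plain slicing of the last axis; a sketch assuming the parameter ordering listed above (split_dki_param_sketch is an illustrative name, not dipy’s API):

```python
import numpy as np

def split_dki_param_sketch(dki_params):
    # Slice the 27-parameter axis into eigenvalues (3), the eigenvector
    # matrix (9 components, reshaped to 3x3) and the kurtosis elements (15).
    p = np.asarray(dki_params)
    evals = p[..., :3]
    evecs = p[..., 3:12].reshape(p.shape[:-1] + (3, 3))
    kt = p[..., 12:]
    return evals, evecs, kt

evals, evecs, kt = split_dki_param_sketch(np.arange(27.0))
```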

MedianOtsuFlow

class dipy.workflows.segment.MedianOtsuFlow(output_strategy='append', mix_names=False, force=False, skip=False)

Bases: dipy.workflows.workflow.Workflow

Methods

get_io_iterator() Create an iterator for IO.
get_short_name()
get_sub_runs() No sub runs since this is a simple workflow.
manage_output_overwrite() Check if a file will be overwritten upon processing the inputs.
run(input_files[, save_masked, ...]) Workflow wrapping the median_otsu segmentation method.
__init__(output_strategy='append', mix_names=False, force=False, skip=False)

The basic workflow object.

This object takes care of any workflow operation that is common to all the workflows. Every new workflow should extend this class.

classmethod get_short_name()
run(input_files, save_masked=False, median_radius=2, numpass=5, autocrop=False, vol_idx=None, dilate=None, out_dir='', out_mask='brain_mask.nii.gz', out_masked='dwi_masked.nii.gz')

Workflow wrapping the median_otsu segmentation method.

Applies median_otsu segmentation on each file found by ‘globbing’ input_files and saves the results in a directory specified by out_dir.

Parameters:

input_files : string

Path to the input volumes. This path may contain wildcards to process multiple inputs at once.

save_masked : bool

Save mask

median_radius : int, optional

Radius (in voxels) of the applied median filter (default 2)

numpass : int, optional

Number of passes of the median filter (default 5)

autocrop : bool, optional

If True, the masked input_volumes will also be cropped using the bounding box defined by the masked data. For example, if diffusion images are of 1x1x1 (mm^3) or higher resolution auto-cropping could reduce their size in memory and speed up some of the analysis. (default False)

vol_idx : variable int, optional

1D array representing indices of axis=3 of a 4D input_volume. ‘None’ (the default) corresponds to (0,) (assumes first volume in 4D array). From the command line use 3 4 5 6; from a script use [3, 4, 5, 6].

dilate : int, optional

number of iterations for binary dilation (default ‘None’)

out_dir : string, optional

Output directory (default input file directory)

out_mask : string, optional

Name of the mask volume to be saved (default ‘brain_mask.nii.gz’)

out_masked : string, optional

Name of the masked volume to be saved (default ‘dwi_masked.nii.gz’)

Workflow

class dipy.workflows.segment.Workflow(output_strategy='append', mix_names=False, force=False, skip=False)

Bases: object

Methods

get_io_iterator() Create an iterator for IO.
get_short_name() A short name for the workflow used to subdivide
get_sub_runs() No sub runs since this is a simple workflow.
manage_output_overwrite() Check if a file will be overwritten upon processing the inputs.
run() Since this is an abstract class, raise exception if this code is
__init__(output_strategy='append', mix_names=False, force=False, skip=False)

The basic workflow object.

This object takes care of any workflow operation that is common to all the workflows. Every new workflow should extend this class.

get_io_iterator()

Create an iterator for IO.

Use a couple of inspection tricks to build an IOIterator using the previous frame (values of local variables and other contextuals) and the run method’s docstring.

classmethod get_short_name()

A short name for the workflow used to subdivide

The short name is used by CombinedWorkflows and the argparser to subdivide the commandline parameters avoiding the trouble of having subworkflows parameters with the same name.

For example, a combined workflow with dti reconstruction and csd reconstruction might both end up with a b0_threshold parameter. Using short names, we will have dti.b0_threshold and csd.b0_threshold available.

Returns class name by default but it is strongly advised to set it to something shorter and easier to write on commandline.

get_sub_runs()

No sub runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

If it is bound to happen, an action is taken depending on self._force_overwrite (or --force via command line). A log message is output independently of the outcome to tell the user something happened.

run()

Since this is an abstract class, raise an exception if this code is reached (not implemented in the child class, or literally called on this class)

load_nifti

dipy.workflows.segment.load_nifti(fname, return_img=False, return_voxsize=False, return_coords=False)

median_otsu

dipy.workflows.segment.median_otsu(input_volume, median_radius=4, numpass=4, autocrop=False, vol_idx=None, dilate=None)

Simple brain extraction tool method for images from DWI data.

It uses a median filter smoothing of the input_volume (over the volumes selected by vol_idx) and an automatic histogram Otsu thresholding technique, hence the name median_otsu.

This function is inspired by MRtrix’s bet, which has default values median_radius=3, numpass=2. However, from tests on multiple 1.5T and 3T datasets from GE, Philips and Siemens, the most robust choice is median_radius=4, numpass=4.

Parameters:

input_volume : ndarray

ndarray of the brain volume

median_radius : int

Radius (in voxels) of the applied median filter (default: 4).

numpass: int

Number of passes of the median filter (default: 4).

autocrop: bool, optional

if True, the masked input_volume will also be cropped using the bounding box defined by the masked data. Should be on if DWI is upsampled to 1x1x1 resolution. (default: False).

vol_idx : None or array, optional

1D array representing indices of axis=3 of a 4D input_volume. None (the default) corresponds to (0,) (assumes first volume in 4D array).

dilate : None or int, optional

number of iterations for binary dilation

Returns:

maskedvolume : ndarray

Masked input_volume

mask : 3D ndarray

The binary brain mask
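The thresholding at the core of the method is Otsu’s histogram criterion: choose the cut that maximizes the between-class variance. A minimal sketch of that criterion (not the scikit-image implementation credited in the Notes below):

```python
import numpy as np

def otsu_threshold(image, nbins=256):
    # Pick the histogram cut maximizing the between-class variance
    # w0 * w1 * (mu0 - mu1)^2 (Otsu's criterion).
    hist, edges = np.histogram(np.ravel(image), bins=nbins)
    hist = hist.astype(float)
    centers = (edges[:-1] + edges[1:]) / 2.0
    w0 = np.cumsum(hist)                      # mass at or below each bin
    w1 = np.cumsum(hist[::-1])[::-1]          # mass at or above each bin
    mu0 = np.cumsum(hist * centers) / w0
    mu1 = (np.cumsum((hist * centers)[::-1]) / np.cumsum(hist[::-1]))[::-1]
    between = w0[:-1] * w1[1:] * (mu0[:-1] - mu1[1:]) ** 2
    return centers[np.argmax(between)]

# Demo: clearly bimodal intensities (background ~10, brain ~200).
data = np.concatenate([np.full(100, 10.0), np.full(100, 200.0)])
thr = otsu_threshold(data)
```

median_otsu applies this kind of threshold after median filtering, then keeps the largest connected component as the brain mask.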

Notes

Copyright (C) 2011, the scikit-image team All rights reserved.

Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met:

  1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer.
  2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.
  3. Neither the name of skimage nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE AUTHOR ``AS IS’’ AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

save_nifti

dipy.workflows.segment.save_nifti(fname, data, affine, hdr=None)

DetTrackPAMFlow

class dipy.workflows.tracking.DetTrackPAMFlow(output_strategy='append', mix_names=False, force=False, skip=False)

Bases: dipy.workflows.tracking.GenericTrackFlow

Methods

get_io_iterator() Create an iterator for IO.
get_short_name()
get_sub_runs() No sub runs since this is a simple workflow.
manage_output_overwrite() Check if a file will be overwritten upon processing the inputs.
run(pam_files, stopping_files, seeding_files) Workflow for deterministic tracking using a saved peaks and metrics (PAM) file as input.
__init__(output_strategy='append', mix_names=False, force=False, skip=False)

The basic workflow object.

This object takes care of any workflow operation that is common to all the workflows. Every new workflow should extend this class.

classmethod get_short_name()
run(pam_files, stopping_files, seeding_files, stopping_thr=0.2, seed_density=1, use_sh=False, out_dir='', out_tractogram='tractogram.trk')

Workflow for deterministic tracking using a saved peaks and metrics (PAM) file as input.

Parameters:

pam_files : string

Path to the peaks and metrics files. This path may contain wildcards to use multiple masks at once.

stopping_files : string

Path of FA or other images used for stopping criteria for tracking.

seeding_files : string

A binary image showing where we need to seed for tracking.

stopping_thr : float, optional

Threshold applied to stopping volume’s data to identify where tracking has to stop (default 0.2).

seed_density : int, optional

Number of seeds per dimension inside voxel (default 1).

For example, a seed_density of 2 means 8 regularly distributed points in the voxel, and a seed_density of 1 means 1 point at the center of the voxel.

use_sh : bool, optional

Use spherical harmonics saved in peaks to find the maximum peak cone (default False).

out_dir : string, optional

Output directory (default input file directory)

out_tractogram : string, optional

Name of the tractogram file to be saved (default ‘tractogram.trk’)

References

Garyfallidis, University of Cambridge, PhD thesis 2012. Amirbekian, University of California San Francisco, PhD thesis 2017.

DeterministicMaximumDirectionGetter

class dipy.workflows.tracking.DeterministicMaximumDirectionGetter

Bases: dipy.direction.probabilistic_direction_getter.ProbabilisticDirectionGetter

Return direction of a sphere with the highest probability mass function (pmf).

Methods

from_pmf Constructor for making a DirectionGetter from an array of Pmfs
from_shcoeff Probabilistic direction getter from a distribution of directions
get_direction
initial_direction Returns best directions at seed location to start tracking.
__init__()

Initialize self. See help(type(self)) for accurate signature.

GenericTrackFlow

class dipy.workflows.tracking.GenericTrackFlow(output_strategy='append', mix_names=False, force=False, skip=False)

Bases: dipy.workflows.workflow.Workflow

Methods

get_io_iterator() Create an iterator for IO.
get_short_name() Return a short name for the workflow, used to subdivide parameters.
get_sub_runs() No sub runs since this is a simple workflow.
manage_output_overwrite() Check if a file will be overwritten upon processing the inputs.
run() Raise an exception, since this is an abstract class.
__init__(output_strategy='append', mix_names=False, force=False, skip=False)

The basic workflow object.

This object takes care of any workflow operation that is common to all the workflows. Every new workflow should extend this class.

LocalTracking

class dipy.workflows.tracking.LocalTracking(direction_getter, tissue_classifier, seeds, affine, step_size, max_cross=None, maxlen=500, fixedstep=True, return_all=True)

Bases: object

__init__(direction_getter, tissue_classifier, seeds, affine, step_size, max_cross=None, maxlen=500, fixedstep=True, return_all=True)

Creates streamlines by using local fiber-tracking.

Parameters:

direction_getter : instance of DirectionGetter

Used to get directions for fiber tracking.

tissue_classifier : instance of TissueClassifier

Identifies endpoints and invalid points to inform tracking.

seeds : array (N, 3)

Points to seed the tracking. Seed points should be given in point space of the track (see affine).

affine : array (4, 4)

Coordinate space for the streamline point with respect to voxel indices of input data. This affine can contain scaling, rotational, and translational components but should not contain any shearing. An identity matrix can be used to generate streamlines in “voxel coordinates” as long as isotropic voxels were used to acquire the data.

step_size : float

Step size used for tracking.

max_cross : int or None

The maximum number of directions to track from each seed in crossing voxels. By default all initial directions are tracked.

maxlen : int

Maximum number of steps to track from seed. Used to prevent infinite loops.

fixedstep : bool

If true, a fixed step size is used; otherwise a variable step size is used.

return_all : bool

If true, return all generated streamlines, otherwise only streamlines reaching end points or exiting the image.
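The core loop LocalTracking performs can be sketched as a fixed-step integration from a seed point. In this sketch, get_direction stands in for a DirectionGetter and keep_going for a TissueClassifier; both are hypothetical callables, not DIPY APIs:

```python
def track_from_seed(seed, get_direction, keep_going, step_size=0.5, maxlen=500):
    """Minimal fixed-step tracking loop in the spirit of LocalTracking.

    Steps along get_direction(point) until keep_going(point) is False or
    maxlen steps have been taken. Returns the list of visited points.
    Illustrative sketch only.
    """
    point = list(seed)
    streamline = [tuple(point)]
    for _ in range(maxlen):
        if not keep_going(point):
            break
        d = get_direction(point)
        point = [p + step_size * c for p, c in zip(point, d)]
        streamline.append(tuple(point))
    return streamline

# Track straight along +x until x reaches 2.0.
sl = track_from_seed((0.0, 0.0, 0.0),
                     get_direction=lambda p: (1.0, 0.0, 0.0),
                     keep_going=lambda p: p[0] < 2.0,
                     step_size=0.5)
print(len(sl))  # 5 points: x = 0.0, 0.5, 1.0, 1.5, 2.0
```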

ThresholdTissueClassifier

class dipy.workflows.tracking.ThresholdTissueClassifier

Bases: dipy.tracking.local.tissue_classifier.TissueClassifier

# Declarations from tissue_classifier.pxd below:
cdef:
    double threshold, interp_out_double[1]
    double[:] interp_out_view = interp_out_view
    double[:, :, :] metric_map

Methods

check_point
__init__()

Initialize self. See help(type(self)) for accurate signature.

Tractogram

class dipy.workflows.tracking.Tractogram(streamlines=None, data_per_streamline=None, data_per_point=None, affine_to_rasmm=None)

Bases: object

Container for streamlines and their data information.

Streamlines of a tractogram can be in any coordinate system of your choice as long as you provide the correct affine_to_rasmm matrix, at construction time. When applied to streamlines coordinates, that transformation matrix should bring the streamlines back to world space (RAS+ and mm space) [1]_.

Moreover, when streamlines are mapped back to voxel space [2]_, a streamline point located at an integer coordinate (i,j,k) is considered to be at the center of the corresponding voxel. This is in contrast with other conventions where it might have referred to a corner.

References

[1] http://nipy.org/nibabel/coordinate_systems.html#naming-reference-spaces [2] http://nipy.org/nibabel/coordinate_systems.html#voxel-coordinates-are-in-voxel-space
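The affine_to_rasmm and voxel-center conventions described above can be illustrated with a plain 4x4 affine applied to an integer voxel index. The affine below is hypothetical (2 mm isotropic voxels with a translated origin), chosen only to show the mapping:

```python
def apply_affine_to_point(affine, point):
    """Apply a 4x4 affine (nested lists) to a 3D point, in plain Python."""
    x, y, z = point
    return tuple(sum(row[j] * v for j, v in enumerate((x, y, z, 1.0)))
                 for row in affine[:3])

# Hypothetical affine_to_rasmm: 2 mm isotropic voxels, origin shifted by -1 mm.
affine = [[2.0, 0.0, 0.0, -1.0],
          [0.0, 2.0, 0.0, -1.0],
          [0.0, 0.0, 2.0, -1.0],
          [0.0, 0.0, 0.0, 1.0]]

# Under the voxel-center convention, the integer index (0, 0, 0) is the
# center of the first voxel; this affine maps it to (-1, -1, -1) mm.
print(apply_affine_to_point(affine, (0, 0, 0)))
```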

Attributes

streamlines (ArraySequence object) Sequence of \(T\) streamlines. Each streamline is an ndarray of shape (\(N_t\), 3) where \(N_t\) is the number of points of streamline \(t\).
data_per_streamline (PerArrayDict object) Dictionary where the items are (str, 2D array). Each key represents a piece of information \(i\) to be kept alongside every streamline, and its associated value is a 2D array of shape (\(T\), \(P_i\)) where \(T\) is the number of streamlines and \(P_i\) is the number of values to store for that particular piece of information \(i\).
data_per_point (PerArraySequenceDict object) Dictionary where the items are (str, ArraySequence). Each key represents a piece of information \(i\) to be kept alongside every point of every streamline, and its associated value is an iterable of ndarrays of shape (\(N_t\), \(M_i\)) where \(N_t\) is the number of points for a particular streamline \(t\) and \(M_i\) is the number of values to store for that particular piece of information \(i\).

Methods

apply_affine(affine[, lazy]) Applies an affine transformation on the points of each streamline.
copy() Returns a copy of this Tractogram object.
extend(other) Appends the data of another Tractogram.
to_world([lazy]) Brings the streamlines to world space (i.e.
__init__(streamlines=None, data_per_streamline=None, data_per_point=None, affine_to_rasmm=None)
Parameters:

streamlines : iterable of ndarrays or ArraySequence, optional

Sequence of \(T\) streamlines. Each streamline is an ndarray of shape (\(N_t\), 3) where \(N_t\) is the number of points of streamline \(t\).

data_per_streamline : dict of iterable of ndarrays, optional

Dictionary where the items are (str, iterable). Each key represents a piece of information \(i\) to be kept alongside every streamline, and its associated value is an iterable of ndarrays of shape (\(P_i\),) where \(P_i\) is the number of scalar values to store for that particular piece of information \(i\).

data_per_point : dict of iterable of ndarrays, optional

Dictionary where the items are (str, iterable). Each key represents a piece of information \(i\) to be kept alongside every point of every streamline, and its associated value is an iterable of ndarrays of shape (\(N_t\), \(M_i\)) where \(N_t\) is the number of points for a particular streamline \(t\) and \(M_i\) is the number of scalar values to store for that particular piece of information \(i\).

affine_to_rasmm : ndarray of shape (4, 4) or None, optional

Transformation matrix that brings the streamlines contained in this tractogram to RAS+ and mm space where coordinate (0,0,0) refers to the center of the voxel. By default, the streamlines are in an unknown space, i.e. affine_to_rasmm is None.

affine_to_rasmm

Affine bringing streamlines in this tractogram to RAS+mm.

apply_affine(affine, lazy=False)

Applies an affine transformation on the points of each streamline.

If lazy is False (the default), the transformation is performed in-place.

Parameters:

affine : ndarray of shape (4, 4)

Transformation that will be applied to every streamline.

lazy : {False, True}, optional

If True, streamlines are not transformed in-place and a LazyTractogram object is returned. Otherwise, streamlines are modified in-place.

Returns:

tractogram : Tractogram or LazyTractogram object

Tractogram where the streamlines have been transformed according to the given affine transformation. If the lazy option is true, it returns a LazyTractogram object, otherwise it returns a reference to this Tractogram object with updated streamlines.

copy()

Returns a copy of this Tractogram object.

data_per_point
data_per_streamline
extend(other)

Appends the data of another Tractogram.

Data that will be appended includes the streamlines and the content of both dictionaries data_per_streamline and data_per_point.

Parameters:

other : Tractogram object

Its data will be appended to the data of this tractogram.

Returns:

None

Notes

The entries in both dictionaries self.data_per_streamline and self.data_per_point must match respectively those contained in the other tractogram.
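The bookkeeping extend() performs, including the key-matching requirement from the Notes above, can be sketched with plain dicts and lists. The helper and the data layout here are hypothetical, not DIPY's implementation:

```python
def extend_tractogram_data(this, other):
    """Sketch of Tractogram.extend's bookkeeping on plain dicts.

    `this` and `other` are hypothetical dicts with 'streamlines' and
    'data_per_streamline' entries; the keys of data_per_streamline must
    match between the two, mirroring the Notes above.
    """
    if set(this['data_per_streamline']) != set(other['data_per_streamline']):
        raise ValueError("data_per_streamline keys must match")
    this['streamlines'].extend(other['streamlines'])
    for key, values in other['data_per_streamline'].items():
        this['data_per_streamline'][key].extend(values)

a = {'streamlines': [[(0, 0, 0), (1, 0, 0)]],
     'data_per_streamline': {'fa_mean': [0.4]}}
b = {'streamlines': [[(0, 1, 0), (1, 1, 0)]],
     'data_per_streamline': {'fa_mean': [0.6]}}
extend_tractogram_data(a, b)
print(len(a['streamlines']))  # 2
```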

streamlines
to_world(lazy=False)

Brings the streamlines to world space (i.e. RAS+ and mm).

If lazy is False (the default), the transformation is performed in-place.

Parameters:

lazy : {False, True}, optional

If True, streamlines are not transformed in-place and a LazyTractogram object is returned. Otherwise, streamlines are modified in-place.

Returns:

tractogram : Tractogram or LazyTractogram object

Tractogram where the streamlines have been sent to world space. If the lazy option is true, it returns a LazyTractogram object, otherwise it returns a reference to this Tractogram object with updated streamlines.

Workflow

class dipy.workflows.tracking.Workflow(output_strategy='append', mix_names=False, force=False, skip=False)

Bases: object

Methods

get_io_iterator() Create an iterator for IO.
get_short_name() Return a short name for the workflow, used to subdivide parameters.
get_sub_runs() No sub runs since this is a simple workflow.
manage_output_overwrite() Check if a file will be overwritten upon processing the inputs.
run() Raise an exception, since this is an abstract class.
__init__(output_strategy='append', mix_names=False, force=False, skip=False)

The basic workflow object.

This object takes care of any workflow operation that is common to all the workflows. Every new workflow should extend this class.

get_io_iterator()

Create an iterator for IO.

Use a couple of inspection tricks to build an IOIterator using the previous frame (values of local variables and other contextuals) and the run method’s docstring.

classmethod get_short_name()

A short name for the workflow, used to subdivide parameters.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having sub-workflow parameters with the same name.

For example, a combined workflow with dti reconstruction and csd reconstruction might end up with the b0_threshold parameter. Using short names, we will have dti.b0_threshold and csd.b0_threshold available.

Returns the class name by default, but it is strongly advised to set it to something shorter and easier to write on the command line.
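The parameter subdivision described above can be sketched as simple name prefixing. The helper below is hypothetical, not a DIPY function; it only illustrates how short names keep a shared parameter like b0_threshold distinguishable:

```python
def prefix_params(short_name, params):
    """Prefix each parameter name with a workflow's short name.

    Hypothetical helper illustrating the dti.b0_threshold / csd.b0_threshold
    namespacing scheme described above.
    """
    return {'%s.%s' % (short_name, p): v for p, v in params.items()}

dti = prefix_params('dti', {'b0_threshold': 50})
csd = prefix_params('csd', {'b0_threshold': 0})
print(sorted(dti) + sorted(csd))  # ['dti.b0_threshold', 'csd.b0_threshold']
```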

get_sub_runs()

No sub runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

If it is bound to happen, an action is taken depending on self._force_overwrite (or --force via the command line). A log message is output independently of the outcome to tell the user something happened.
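The overwrite policy described above can be sketched as a small decision function. The helper and its signature are hypothetical, not DIPY's implementation; note that a message is logged on every path, mirroring the docstring:

```python
def decide_overwrite(path_exists, force_overwrite, log):
    """Sketch of the output-overwrite policy (hypothetical helper).

    Returns True if processing may proceed; appends a log message on
    every path, independently of the outcome.
    """
    if not path_exists:
        log.append('output does not exist yet, writing')
        return True
    if force_overwrite:
        log.append('output exists, overwriting (force enabled)')
        return True
    log.append('output exists, skipping (use --force to overwrite)')
    return False

log = []
print(decide_overwrite(True, False, log))  # False
```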

run()

Raise an exception, since this is an abstract class: run() must be implemented in a child class rather than called on this class.

load_nifti

dipy.workflows.tracking.load_nifti(fname, return_img=False, return_voxsize=False, return_coords=False)

load_peaks

dipy.workflows.tracking.load_peaks(fname, verbose=False)

Load a PeaksAndMetrics HDF5 file (PAM5)

Parameters:

fname : string

Filename of PAM5 file.

verbose : bool

Print summary information about the loaded file.

Returns:

pam : PeaksAndMetrics object

save

dipy.workflows.tracking.save(tractogram, filename, **kwargs)

Saves a tractogram to a file.

Parameters:

tractogram : Tractogram object or TractogramFile object

If Tractogram object, the file format will be guessed from filename and a TractogramFile object will be created using provided keyword arguments. If TractogramFile object, the file format is known and will be used to save its content to filename.

filename : str

Name of the file where the tractogram will be saved.

**kwargs : keyword arguments

Keyword arguments passed to TractogramFile constructor. Should not be specified if tractogram is already an instance of TractogramFile.
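The "file format will be guessed from filename" behaviour described for save can be sketched as extension-based dispatch. The mapping below is illustrative only; the real detection lives in nibabel.streamlines and may support other formats:

```python
import os

def guess_tractogram_format(filename):
    """Guess a tractogram file format from the filename extension.

    Illustrative sketch of extension-based dispatch, not the actual
    detection logic used by `save`.
    """
    formats = {'.trk': 'TrkFile', '.tck': 'TckFile'}
    ext = os.path.splitext(filename)[1].lower()
    try:
        return formats[ext]
    except KeyError:
        raise ValueError("Unknown tractogram extension: %r" % ext)

print(guess_tractogram_format('tractogram.trk'))  # TrkFile
```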

Workflow

class dipy.workflows.workflow.Workflow(output_strategy='append', mix_names=False, force=False, skip=False)

Bases: object

Methods

get_io_iterator() Create an iterator for IO.
get_short_name() Return a short name for the workflow, used to subdivide parameters.
get_sub_runs() No sub runs since this is a simple workflow.
manage_output_overwrite() Check if a file will be overwritten upon processing the inputs.
run() Raise an exception, since this is an abstract class.
__init__(output_strategy='append', mix_names=False, force=False, skip=False)

The basic workflow object.

This object takes care of any workflow operation that is common to all the workflows. Every new workflow should extend this class.

get_io_iterator()

Create an iterator for IO.

Use a couple of inspection tricks to build an IOIterator using the previous frame (values of local variables and other contextuals) and the run method’s docstring.

classmethod get_short_name()

A short name for the workflow, used to subdivide parameters.

The short name is used by CombinedWorkflows and the argparser to subdivide the command-line parameters, avoiding the trouble of having sub-workflow parameters with the same name.

For example, a combined workflow with dti reconstruction and csd reconstruction might end up with the b0_threshold parameter. Using short names, we will have dti.b0_threshold and csd.b0_threshold available.

Returns the class name by default, but it is strongly advised to set it to something shorter and easier to write on the command line.

get_sub_runs()

No sub runs since this is a simple workflow.

manage_output_overwrite()

Check if a file will be overwritten upon processing the inputs.

If it is bound to happen, an action is taken depending on self._force_overwrite (or --force via the command line). A log message is output independently of the outcome to tell the user something happened.

run()

Raise an exception, since this is an abstract class: run() must be implemented in a child class rather than called on this class.

io_iterator_

dipy.workflows.workflow.io_iterator_(frame, fnc, output_strategy='append', mix_names=False)

Creates an IOIterator using introspection.

Parameters:

frame : frameobject

Contains the info about the current local variables values.

fnc : function

The function to inspect

output_strategy : string

Controls the behavior of the IOIterator for output paths.

mix_names : bool

Whether or not to append a mix of input names at the beginning.

Returns:

Properly instantiated IOIterator object.
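The frame-based introspection that io_iterator_ relies on can be sketched with the standard inspect module: reading the local variables of the calling frame. The workflow function and variable names below are hypothetical, chosen only to show the trick:

```python
import inspect

def caller_locals():
    """Return a snapshot of the *calling* frame's local variables.

    Sketch of the introspection trick described for io_iterator_;
    illustrative only, not DIPY's implementation.
    """
    frame = inspect.currentframe().f_back
    try:
        return dict(frame.f_locals)
    finally:
        del frame  # break the reference cycle with the frame object

def run_workflow():
    # Hypothetical workflow locals that an IOIterator would discover.
    pam_files = 'peaks.pam5'
    out_dir = 'out'
    return caller_locals()

local_vars = run_workflow()
print(sorted(local_vars))  # ['out_dir', 'pam_files']
```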