io

Dpy(fname[, mode, compression])

Methods

load_pickle(fname) Load object from pickle file fname
orientation_from_string(string_ornt) Returns an array representation of an ornt string
orientation_to_string(ornt) Returns a string representation of a 3d ornt
ornt_mapping(ornt1, ornt2) Calculates the mapping needed to get from ornt1 to ornt2
read_bvals_bvecs(fbvals, fbvecs) Read b-values and b-vectors from disk
read_bvec_file(filename[, atol]) Read gradient table information from a pair of files with extensions .bvec and .bval.
reorient_on_axis(input, current_ornt, new_ornt)
reorient_vectors(input, current_ornt, new_ornt) Changes the orientation of gradients or other vectors
save_pickle(fname, dix) Save dix to fname as pickle

Module: io.bvectxt

orientation_from_string(string_ornt) Returns an array representation of an ornt string
orientation_to_string(ornt) Returns a string representation of a 3d ornt
ornt_mapping(ornt1, ornt2) Calculates the mapping needed to get from ornt1 to ornt2
read_bvec_file(filename[, atol]) Read gradient table information from a pair of files with extensions .bvec and .bval.
reorient_on_axis(input, current_ornt, new_ornt)
reorient_vectors(input, current_ornt, new_ornt) Changes the orientation of gradients or other vectors
splitext(p) Split the extension from a pathname.

Module: io.dpy

A class for handling large tractography datasets.

It is built on h5py, which in turn implements key features of the HDF5 (hierarchical data format) API [R61].

Dpy(fname[, mode, compression])

Methods

Streamlines alias of ArraySequence

Module: io.gradients

InTemporaryDirectory([suffix, prefix, dir]) Create, return, and change directory to a temporary directory
read_bvals_bvecs(fbvals, fbvecs) Read b-values and b-vectors from disk
splitext(p) Split the extension from a pathname.

Module: io.image

load_nifti(fname[, return_img, ...])
save_nifti(fname, data, affine[, hdr])

Module: io.peaks

PeaksAndMetrics

Attributes

Sphere([x, y, z, theta, phi, xyz, faces, edges]) Points on the unit sphere.
load_peaks(fname[, verbose]) Load a PeaksAndMetrics HDF5 file (PAM5)
peaks_to_niftis(pam, fname_shm, fname_dirs, ...) Save SH, directions, indices and values of peaks to Nifti.
reshape_peaks_for_visualization(peaks) Reshape peaks for visualization.
save_nifti(fname, data, affine[, hdr])
save_peaks(fname, pam[, affine, verbose]) Save all important attributes of object PeaksAndMetrics in a PAM5 file (HDF5).

Module: io.pickles

Load and save pickles

load_pickle(fname) Load object from pickle file fname
save_pickle(fname, dix) Save dix to fname as pickle

Module: io.streamline

Field Header fields common to multiple streamline file formats.
aff2axcodes(aff[, labels, tol]) axis direction codes for affine aff
load_trk(filename) Loads tractogram files (*.trk)
save_trk(fname, streamlines, affine[, ...]) Saves tractogram files (*.trk)

Module: io.trackvis

save_trk(filename, points, vox_to_ras, shape) A temporary helper function for saving trk files.

Module: io.utils

Utility functions for file formats

Nifti1Image(dataobj, affine[, header, ...]) Class for single file NIfTI1 format image
make5d(input) Reshapes the input to have 5 dimensions, adding extra dimensions just before the last dimension
nifti1_symmat(image_data, *args, **kwargs) Returns a Nifti1Image with a symmetric matrix intent

Module: io.vtk

load_polydata(file_name) Load a vtk polydata from a supported format file
optional_package(name[, trip_msg]) Return package-like thing and module setup for package name
save_polydata(polydata, file_name[, binary, ...]) Save a vtk polydata to a supported format file
set_input(vtk_object, inp) Generic input function which takes into account VTK 5 or 6
setup_module()

Dpy

class dipy.io.Dpy(fname, mode='r', compression=0)

Bases: object

Methods

close()
read_track() read one track each time
read_tracks() read the entire tractography
read_tracksi(indices) read tracks with specific indices
version()
write_track(track) write one track each time
write_tracks(tracks) write many tracks together
__init__(fname, mode='r', compression=0)

Advanced storage system for tractography based on HDF5

Parameters:

fname : str, full filename

mode : ‘r’ read, ‘w’ write, ‘r+’ read and write (only if the file already exists)

compression : int, 0 (no compression) to 9 (maximum compression)

Examples

>>> import os
>>> import numpy as np
>>> from tempfile import mkstemp  # temp file
>>> from dipy.io.dpy import Dpy
>>> def dpy_example():
...     fd,fname = mkstemp()
...     fname += '.dpy'#add correct extension
...     dpw = Dpy(fname,'w')
...     A=np.ones((5,3))
...     B=2*A.copy()
...     C=3*A.copy()
...     dpw.write_track(A)
...     dpw.write_track(B)
...     dpw.write_track(C)
...     dpw.close()
...     dpr = Dpy(fname,'r')
...     dpr.read_track()
...     dpr.read_track()
...     dpr.read_tracksi([0, 1, 2, 0, 0, 2])
...     dpr.close()
...     os.remove(fname) #delete file from disk
>>> dpy_example()
close()
read_track()

read one track each time

read_tracks()

read the entire tractography

read_tracksi(indices)

read tracks with specific indices

version()
write_track(track)

write one track each time

write_tracks(tracks)

write many tracks together

load_pickle

dipy.io.load_pickle(fname)

Load object from pickle file fname

Parameters:

fname : str

filename to load dict or other python object

Returns:

dix : object

dictionary or other object

See also

dipy.io.pickles.save_pickle

orientation_from_string

dipy.io.orientation_from_string(string_ornt)

Returns an array representation of an ornt string

orientation_to_string

dipy.io.orientation_to_string(ornt)

Returns a string representation of a 3d ornt

ornt_mapping

dipy.io.ornt_mapping(ornt1, ornt2)

Calculates the mapping needed to get from ornt1 to ornt2

read_bvals_bvecs

dipy.io.read_bvals_bvecs(fbvals, fbvecs)

Read b-values and b-vectors from disk

Parameters:

fbvals : str

Full path to file with b-values. None to not read bvals.

fbvecs : str

Full path of file with b-vectors. None to not read bvecs.

Returns:

bvals : array, (N,) or None

bvecs : array, (N, 3) or None

Notes

Files can be either ‘.bvals’/’.bvecs’ or ‘.txt’ or ‘.npy’ (containing arrays stored with the appropriate values).
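The one-row bval / three-row bvec layout described under read_bvec_file is the common on-disk form these readers consume. A minimal numpy-only sketch of writing such a pair and loading it with the same (N,) / (N, 3) convention that read_bvals_bvecs returns (file names and values are illustrative):

```python
import os
import numpy as np
from tempfile import mkdtemp

# Write a small FSL-style pair: the bval file is one row of b-values,
# the bvec file is three rows (x, y, z components per volume).
tmp = mkdtemp()
fbvals = os.path.join(tmp, 'dwi.bval')
fbvecs = os.path.join(tmp, 'dwi.bvec')
np.savetxt(fbvals, np.array([[0, 1000, 1000]]), fmt='%d')
np.savetxt(fbvecs, np.array([[0, 1, 0],
                             [0, 0, 1],
                             [0, 0, 0]]), fmt='%d')

# read_bvals_bvecs returns bvals with shape (N,) and bvecs with shape
# (N, 3), i.e. the three-row bvec layout is transposed on load.  A
# plain-numpy equivalent of that convention:
bvals = np.loadtxt(fbvals)
bvecs = np.loadtxt(fbvecs).T
```

With dipy installed, `bvals, bvecs = read_bvals_bvecs(fbvals, fbvecs)` on the same files yields arrays of these same shapes.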

read_bvec_file

dipy.io.read_bvec_file(filename, atol=0.001)

Read gradient table information from a pair of files with extensions .bvec and .bval. The bval file should have one row of values representing the b-values of each volume in the DWI data set. The bvec file should have three rows, where the rows are the x, y, and z components of the normalized gradient direction for each of the volumes.

Parameters:

filename : str

The path to either the bvec or bval file

atol : float, optional

The tolerance used to check that all the gradient directions are normalized. Default is 0.001.

reorient_on_axis

dipy.io.reorient_on_axis(input, current_ornt, new_ornt, axis=0)

reorient_vectors

dipy.io.reorient_vectors(input, current_ornt, new_ornt, axis=0)

Changes the orientation of gradients or other vectors

Moves vectors, stored along axis, from current_ornt to new_ornt. For example the vector [x, y, z] in “RAS” will be [-x, -y, z] in “LPS”.

R: Right A: Anterior S: Superior L: Left P: Posterior I: Inferior

Examples

>>> import numpy as np
>>> from dipy.io import reorient_vectors
>>> gtab = np.array([[1, 1, 1], [1, 2, 3]])
>>> reorient_vectors(gtab, 'ras', 'asr', axis=1)
array([[1, 1, 1],
       [2, 3, 1]])
>>> reorient_vectors(gtab, 'ras', 'lps', axis=1)
array([[-1, -1,  1],
       [-1, -2,  3]])
>>> bvec = gtab.T
>>> reorient_vectors(bvec, 'ras', 'lps', axis=0)
array([[-1, -1],
       [-1, -2],
       [ 1,  3]])
>>> reorient_vectors(bvec, 'ras', 'lsp')
array([[-1, -1],
       [ 1,  3],
       [-1, -2]])

save_pickle

dipy.io.save_pickle(fname, dix)

Save dix to fname as pickle

Parameters:

fname : str

filename to save object e.g. a dictionary

dix : object

dictionary or other object

Examples

>>> import os
>>> from tempfile import mkstemp
>>> from dipy.io import save_pickle, load_pickle
>>> fd, fname = mkstemp() # make temporary file (opened, attached to fh)
>>> d={0:{'d':1}}
>>> save_pickle(fname, d)
>>> d2=load_pickle(fname)

We remove the temporary file we created for neatness

>>> os.close(fd) # the file is still open, we need to close the fh
>>> os.remove(fname)

orientation_from_string

dipy.io.bvectxt.orientation_from_string(string_ornt)

Returns an array representation of an ornt string

orientation_to_string

dipy.io.bvectxt.orientation_to_string(ornt)

Returns a string representation of a 3d ornt

ornt_mapping

dipy.io.bvectxt.ornt_mapping(ornt1, ornt2)

Calculates the mapping needed to get from ornt1 to ornt2

read_bvec_file

dipy.io.bvectxt.read_bvec_file(filename, atol=0.001)

Read gradient table information from a pair of files with extensions .bvec and .bval. The bval file should have one row of values representing the b-values of each volume in the DWI data set. The bvec file should have three rows, where the rows are the x, y, and z components of the normalized gradient direction for each of the volumes.

Parameters:

filename : str

The path to either the bvec or bval file

atol : float, optional

The tolerance used to check that all the gradient directions are normalized. Default is 0.001.

reorient_on_axis

dipy.io.bvectxt.reorient_on_axis(input, current_ornt, new_ornt, axis=0)

reorient_vectors

dipy.io.bvectxt.reorient_vectors(input, current_ornt, new_ornt, axis=0)

Changes the orientation of gradients or other vectors

Moves vectors, stored along axis, from current_ornt to new_ornt. For example the vector [x, y, z] in “RAS” will be [-x, -y, z] in “LPS”.

R: Right A: Anterior S: Superior L: Left P: Posterior I: Inferior

Examples

>>> import numpy as np
>>> from dipy.io.bvectxt import reorient_vectors
>>> gtab = np.array([[1, 1, 1], [1, 2, 3]])
>>> reorient_vectors(gtab, 'ras', 'asr', axis=1)
array([[1, 1, 1],
       [2, 3, 1]])
>>> reorient_vectors(gtab, 'ras', 'lps', axis=1)
array([[-1, -1,  1],
       [-1, -2,  3]])
>>> bvec = gtab.T
>>> reorient_vectors(bvec, 'ras', 'lps', axis=0)
array([[-1, -1],
       [-1, -2],
       [ 1,  3]])
>>> reorient_vectors(bvec, 'ras', 'lsp')
array([[-1, -1],
       [ 1,  3],
       [-1, -2]])

splitext

dipy.io.bvectxt.splitext(p)

Split the extension from a pathname.

Extension is everything from the last dot to the end, ignoring leading dots. Returns “(root, ext)”; ext may be empty.
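A quick illustration of that contract; dipy.io.bvectxt.splitext appears to simply re-export the standard library's os.path.splitext, whose behavior matches the description above:

```python
from os.path import splitext  # same behavior as described above

# Extension is everything from the last dot; leading dots are ignored.
print(splitext('subject1.bvec'))   # ('subject1', '.bvec')
print(splitext('archive.tar.gz'))  # ('archive.tar', '.gz')
print(splitext('.bashrc'))         # ('.bashrc', '') -- leading dot only
```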

Dpy

class dipy.io.dpy.Dpy(fname, mode='r', compression=0)

Bases: object

Methods

close()
read_track() read one track each time
read_tracks() read the entire tractography
read_tracksi(indices) read tracks with specific indices
version()
write_track(track) write one track each time
write_tracks(tracks) write many tracks together
__init__(fname, mode='r', compression=0)

Advanced storage system for tractography based on HDF5

Parameters:

fname : str, full filename

mode : ‘r’ read, ‘w’ write, ‘r+’ read and write (only if the file already exists)

compression : int, 0 (no compression) to 9 (maximum compression)

Examples

>>> import os
>>> import numpy as np
>>> from tempfile import mkstemp  # temp file
>>> from dipy.io.dpy import Dpy
>>> def dpy_example():
...     fd,fname = mkstemp()
...     fname += '.dpy'#add correct extension
...     dpw = Dpy(fname,'w')
...     A=np.ones((5,3))
...     B=2*A.copy()
...     C=3*A.copy()
...     dpw.write_track(A)
...     dpw.write_track(B)
...     dpw.write_track(C)
...     dpw.close()
...     dpr = Dpy(fname,'r')
...     dpr.read_track()
...     dpr.read_track()
...     dpr.read_tracksi([0, 1, 2, 0, 0, 2])
...     dpr.close()
...     os.remove(fname) #delete file from disk
>>> dpy_example()
close()
read_track()

read one track each time

read_tracks()

read the entire tractography

read_tracksi(indices)

read tracks with specific indices

version()
write_track(track)

write one track each time

write_tracks(tracks)

write many tracks together

Streamlines

dipy.io.dpy.Streamlines

alias of ArraySequence

InTemporaryDirectory

class dipy.io.gradients.InTemporaryDirectory(suffix='', prefix='tmp', dir=None)

Bases: nibabel.tmpdirs.TemporaryDirectory

Create, return, and change directory to a temporary directory

Examples

>>> import os
>>> my_cwd = os.getcwd()
>>> with InTemporaryDirectory() as tmpdir:
...     _ = open('test.txt', 'wt').write('some text')
...     assert os.path.isfile('test.txt')
...     assert os.path.isfile(os.path.join(tmpdir, 'test.txt'))
>>> os.path.exists(tmpdir)
False
>>> os.getcwd() == my_cwd
True

Methods

cleanup()
__init__(suffix='', prefix='tmp', dir=None)

read_bvals_bvecs

dipy.io.gradients.read_bvals_bvecs(fbvals, fbvecs)

Read b-values and b-vectors from disk

Parameters:

fbvals : str

Full path to file with b-values. None to not read bvals.

fbvecs : str

Full path of file with b-vectors. None to not read bvecs.

Returns:

bvals : array, (N,) or None

bvecs : array, (N, 3) or None

Notes

Files can be either ‘.bvals’/’.bvecs’ or ‘.txt’ or ‘.npy’ (containing arrays stored with the appropriate values).

splitext

dipy.io.gradients.splitext(p)

Split the extension from a pathname.

Extension is everything from the last dot to the end, ignoring leading dots. Returns “(root, ext)”; ext may be empty.

load_nifti

dipy.io.image.load_nifti(fname, return_img=False, return_voxsize=False, return_coords=False)

save_nifti

dipy.io.image.save_nifti(fname, data, affine, hdr=None)
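Neither function is documented above, but by NIfTI convention load_nifti returns the image array together with a 4x4 affine mapping voxel indices to world (RAS) coordinates, and save_nifti stores that pair. A numpy sketch of applying such an affine (the matrix values are illustrative, as if obtained from `data, affine = load_nifti(fname)`):

```python
import numpy as np

# A 2 mm isotropic affine with its origin at world (-90, -126, -72);
# values are illustrative only.
affine = np.array([[2., 0., 0.,  -90.],
                   [0., 2., 0., -126.],
                   [0., 0., 2.,  -72.],
                   [0., 0., 0.,    1.]])

# Map a voxel index (i, j, k) to world coordinates: append a 1 to make
# the index homogeneous, then apply the affine.
ijk = np.array([10, 20, 30, 1])
xyz = affine.dot(ijk)[:3]
print(xyz)  # [-70. -86. -12.]
```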

PeaksAndMetrics

class dipy.io.peaks.PeaksAndMetrics

Bases: dipy.reconst.peak_direction_getter.PeaksAndMetricsDirectionGetter

Attributes

ang_thr
qa_thr
total_weight

Methods

get_direction
initial_direction The best starting directions for fiber tracking from a point
__init__()

Initialize self. See help(type(self)) for accurate signature.

Sphere

class dipy.io.peaks.Sphere(x=None, y=None, z=None, theta=None, phi=None, xyz=None, faces=None, edges=None)

Bases: object

Points on the unit sphere.

The sphere can be constructed using one of three conventions:

Sphere(x, y, z)
Sphere(xyz=xyz)
Sphere(theta=theta, phi=phi)
Parameters:

x, y, z : 1-D array_like

Vertices as x-y-z coordinates.

theta, phi : 1-D array_like

Vertices as spherical coordinates. Theta and phi are the inclination and azimuth angles respectively.

xyz : (N, 3) ndarray

Vertices as x-y-z coordinates.

faces : (N, 3) ndarray

Indices into vertices that form triangular faces. If unspecified, the faces are computed using a Delaunay triangulation.

edges : (N, 2) ndarray

Edges between vertices. If unspecified, the edges are derived from the faces.
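The three constructors describe the same vertex set. Under the stated convention (theta is the inclination from +z, phi the azimuth from +x), the spherical-to-Cartesian conversion is the usual one, sketched here in plain numpy (an illustration of the convention, not dipy's internal code):

```python
import numpy as np

# Spherical -> Cartesian under the stated convention: theta is the
# inclination (angle from +z), phi is the azimuth (angle from +x).
theta = np.array([0.0, np.pi / 2, np.pi / 2])
phi = np.array([0.0, 0.0, np.pi / 2])

x = np.sin(theta) * np.cos(phi)
y = np.sin(theta) * np.sin(phi)
z = np.cos(theta)
xyz = np.column_stack([x, y, z])
# Rows: north pole (0, 0, 1), the +x axis (1, 0, 0), the +y axis (0, 1, 0)
```

Sphere(theta=theta, phi=phi) and Sphere(xyz=xyz) built from these arrays would therefore describe the same points.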

Attributes

x
y
z

Methods

edges()
faces()
find_closest(xyz) Find the index of the vertex in the Sphere closest to the input vector
subdivide([n]) Subdivides each face of the sphere into four new faces.
vertices()
__init__(x=None, y=None, z=None, theta=None, phi=None, xyz=None, faces=None, edges=None)
edges()
faces()
find_closest(xyz)

Find the index of the vertex in the Sphere closest to the input vector

Parameters:

xyz : array-like, 3 elements

A unit vector

subdivide(n=1)

Subdivides each face of the sphere into four new faces.

New vertices are created at a, b, and c. Then each face [x, y, z] is divided into faces [x, a, c], [y, a, b], [z, b, c], and [a, b, c].

Parameters:

n : int, optional

The number of subdivisions to perform.

Returns:

new_sphere : Sphere

The subdivided sphere.

vertices()
x
y
z

load_peaks

dipy.io.peaks.load_peaks(fname, verbose=False)

Load a PeaksAndMetrics HDF5 file (PAM5)

Parameters:

fname : string

Filename of PAM5 file.

verbose : bool

Print summary information about the loaded file.

Returns:

pam : PeaksAndMetrics object

peaks_to_niftis

dipy.io.peaks.peaks_to_niftis(pam, fname_shm, fname_dirs, fname_values, fname_indices, fname_gfa, reshape_dirs=False)

Save SH, directions, indices and values of peaks to Nifti.

reshape_peaks_for_visualization

dipy.io.peaks.reshape_peaks_for_visualization(peaks)

Reshape peaks for visualization.

Reshape and convert to float32 a set of peaks for visualization with MRtrix or the FiberNavigator.

save_nifti

dipy.io.peaks.save_nifti(fname, data, affine, hdr=None)

save_peaks

dipy.io.peaks.save_peaks(fname, pam, affine=None, verbose=False)

Save all important attributes of object PeaksAndMetrics in a PAM5 file (HDF5).

Parameters:

fname : string

Filename of PAM5 file

pam : PeaksAndMetrics

Object holding peak_dirs, shm_coeffs and other attributes

affine : array

The 4x4 matrix transforming the data from native to world coordinates. PeaksAndMetrics should have that attribute, but if not it can be provided here. Default None.

verbose : bool

Print summary information about the saved file.

load_pickle

dipy.io.pickles.load_pickle(fname)

Load object from pickle file fname

Parameters:

fname : str

filename to load dict or other python object

Returns:

dix : object

dictionary or other object

See also

dipy.io.pickles.save_pickle

save_pickle

dipy.io.pickles.save_pickle(fname, dix)

Save dix to fname as pickle

Parameters:

fname : str

filename to save object e.g. a dictionary

dix : object

dictionary or other object

Examples

>>> import os
>>> from tempfile import mkstemp
>>> from dipy.io.pickles import save_pickle, load_pickle
>>> fd, fname = mkstemp() # make temporary file (opened, attached to fh)
>>> d={0:{'d':1}}
>>> save_pickle(fname, d)
>>> d2=load_pickle(fname)

We remove the temporary file we created for neatness

>>> os.close(fd) # the file is still open, we need to close the fh
>>> os.remove(fname)

Field

class dipy.io.streamline.Field

Bases: object

Header fields common to multiple streamline file formats.

In IPython, use nibabel.streamlines.Field?? to list them.

__init__()

Initialize self. See help(type(self)) for accurate signature.

DIMENSIONS = 'dimensions'
ENDIANNESS = 'endianness'
MAGIC_NUMBER = 'magic_number'
METHOD = 'method'
NB_POINTS = 'nb_points'
NB_PROPERTIES_PER_STREAMLINE = 'nb_properties_per_streamline'
NB_SCALARS_PER_POINT = 'nb_scalars_per_point'
NB_STREAMLINES = 'nb_streamlines'
ORIGIN = 'origin'
STEP_SIZE = 'step_size'
VOXEL_ORDER = 'voxel_order'
VOXEL_SIZES = 'voxel_sizes'
VOXEL_TO_RASMM = 'voxel_to_rasmm'
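The Field attributes are plain string constants, intended for use as dictionary keys into streamline-file headers rather than hard-coded literals. A small illustration using the literal values listed above (the header contents are hypothetical):

```python
# String constants copied from the Field listing above.
VOXEL_SIZES = 'voxel_sizes'       # Field.VOXEL_SIZES
NB_STREAMLINES = 'nb_streamlines' # Field.NB_STREAMLINES

# A hypothetical trk header mapping, as a reader might return it;
# indexing with the constants avoids typo-prone literal strings.
hdr = {VOXEL_SIZES: (2.0, 2.0, 2.0), NB_STREAMLINES: 120}
voxel_sizes = hdr[VOXEL_SIZES]
n_streamlines = hdr[NB_STREAMLINES]
```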

aff2axcodes

dipy.io.streamline.aff2axcodes(aff, labels=None, tol=None)

axis direction codes for affine aff

Parameters:

aff : (N,M) array-like

affine transformation matrix

labels : optional, None or sequence of (2,) sequences

Labels for negative and positive ends of output axes of aff. See docstring for ornt2axcodes for more detail

tol : None or float

Tolerance for SVD of affine - see io_orientation for more detail.

Returns:

axcodes : (N,) tuple

labels for positive end of voxel axes. Dropped axes get a label of None.

Examples

>>> from dipy.io.streamline import aff2axcodes
>>> aff = [[0,1,0,10],[-1,0,0,20],[0,0,1,30],[0,0,0,1]]
>>> aff2axcodes(aff, (('L','R'),('B','F'),('D','U')))
('B', 'R', 'U')

load_trk

dipy.io.streamline.load_trk(filename)

Loads tractogram files (*.trk)

Parameters:

filename : str

input trk filename

Returns:

streamlines : list of 2D arrays

Each 2D array represents a sequence of 3D points (points, 3).

hdr : dict

header from a trk file

save_trk

dipy.io.streamline.save_trk(fname, streamlines, affine, vox_size=None, shape=None, header=None)

Saves tractogram files (*.trk)

Parameters:

fname : str

output trk filename

streamlines : list of 2D arrays, generator or ArraySequence

Each 2D array represents a sequence of 3D points (points, 3).

affine : array_like (4, 4)

The mapping from voxel coordinates to streamline points.

vox_size : array_like (3,), optional

The sizes of the voxels in the reference image (default: None)

shape : array, shape (dim,), optional

The shape of the reference image (default: None)

header : dict, optional

Metadata associated to the tractogram file(*.trk). (default: None)
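The streamlines argument is simply a sequence of independent (points, 3) float arrays, which need not share a length. A sketch of assembling valid input (the save_trk call itself appears only in a comment, with an illustrative filename, since it needs dipy on the path):

```python
import numpy as np

# Each streamline is an independent (n_points, 3) float array; the
# streamlines in one tractogram need not have the same point count.
streamlines = [np.array([[0., 0., 0.], [1., 0., 0.], [2., 0., 0.]]),
               np.array([[0., 1., 0.], [0., 2., 0.]])]

# Voxel-to-streamline-point mapping (identity here for simplicity).
affine = np.eye(4)

# With dipy available, the call would look like (filename illustrative):
# save_trk('tracks.trk', streamlines, affine,
#          vox_size=(2., 2., 2.), shape=(64, 64, 30))
```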

save_trk

dipy.io.trackvis.save_trk(filename, points, vox_to_ras, shape)

A temporary helper function for saving trk files.

This function will soon be replaced by better trk file support in nibabel.

Nifti1Image

class dipy.io.utils.Nifti1Image(dataobj, affine, header=None, extra=None, file_map=None)

Bases: nibabel.nifti1.Nifti1Pair

Class for single file NIfTI1 format image

Attributes

affine
dataobj
header
in_memory True when any array data is in memory cache
shape

Methods

ImageArrayProxy alias of ArrayProxy
as_reoriented(ornt) Apply an orientation change and return a new image
filespec_to_file_map(klass, filespec) Make file_map for this class from filename filespec
filespec_to_files(klass, filespec) filespec_to_files class method is deprecated.
from_file_map(klass, file_map[, mmap, ...]) class method to create image from mapping in file_map
from_filename(klass, filename[, mmap, ...]) class method to create image from filename filename
from_files(klass, file_map) from_files class method is deprecated.
from_image(klass, img) Class method to create new instance of own class from img
get_affine() Get affine from image
get_data([caching]) Return image data from image with any necessary scaling applied
get_data_dtype()
get_fdata([caching, dtype]) Return floating point image data with necessary scaling applied
get_filename() Fetch the image filename
get_header() Get header from image
get_qform([coded]) Return 4x4 affine matrix from qform parameters in header
get_sform([coded]) Return 4x4 affine matrix from sform parameters in header
get_shape() Return shape for image
header_class alias of Nifti1Header
instance_to_filename(klass, img, filename) Save img in our own format, to name implied by filename
load(klass, filename[, mmap, keep_file_open]) class method to create image from filename filename
make_file_map(klass[, mapping]) Class method to make files holder for this image type
orthoview() Plot the image using OrthoSlicer3D
path_maybe_image(klass, filename[, sniff, ...]) Return True if filename may be image matching this class
set_data_dtype(dtype)
set_filename(filename) Sets the files in the object from a given filename
set_qform(affine[, code, strip_shears]) Set qform header values from 4x4 affine
set_sform(affine[, code]) Set sform transform from 4x4 affine
to_file_map([file_map]) Write image to file_map or contained self.file_map
to_filename(filename) Write image to files implied by filename string
to_files([file_map]) to_files method is deprecated.
to_filespec(filename) to_filespec method is deprecated.
uncache() Delete any cached read of data from proxied data
update_header() Harmonize header with image data and affine
__init__(dataobj, affine, header=None, extra=None, file_map=None)

Initialize image

The image is a combination of (array-like, affine matrix, header), with optional metadata in extra, and filename / file-like objects contained in the file_map mapping.
Parameters:

dataobj : object

Object containing image data. It should be some object that returns an array from np.asanyarray. It should have a shape attribute or property

affine : None or (4,4) array-like

homogeneous affine giving the relationship between voxel coordinates and world coordinates. Affine can also be None. In this case, obj.affine also returns None, and the affine as written to disk will depend on the file format.

header : None or mapping or header instance, optional

metadata for this image format

extra : None or mapping, optional

metadata to associate with image that cannot be stored in the metadata of this image type

file_map : mapping, optional

mapping giving file information for this image format

Notes

If both a header and an affine are specified, and the affine does not match the affine that is in the header, the affine will be used, but the sform_code and qform_code fields in the header will be re-initialised to their default values. This is performed on the basis that, if you are changing the affine, you are likely to be changing the space to which the affine is pointing. The set_sform() and set_qform() methods can be used to update the codes after an image has been created - see those methods, and the manual for more details.

files_types = (('image', '.nii'),)
header_class

alias of Nifti1Header

update_header()

Harmonize header with image data and affine

valid_exts = ('.nii',)

make5d

dipy.io.utils.make5d(input)

Reshapes the input to have 5 dimensions, adding extra dimensions just before the last dimension
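A minimal numpy sketch of that reshape (an illustrative reimplementation, not dipy's code): size-1 axes are inserted just before the last dimension until the array is 5-D, matching the NIfTI convention of keeping per-voxel values in the final axis.

```python
import numpy as np

def make5d_sketch(arr):
    """Reshape arr to 5 dimensions, inserting size-1 axes just before
    the last dimension (illustrative reimplementation)."""
    arr = np.asarray(arr)
    if arr.ndim > 5:
        raise ValueError("array has more than 5 dimensions")
    pad = 5 - arr.ndim
    new_shape = arr.shape[:-1] + (1,) * pad + arr.shape[-1:]
    return arr.reshape(new_shape)

vol = np.zeros((64, 64, 30, 6))   # e.g. i, j, k, coefficients
print(make5d_sketch(vol).shape)   # (64, 64, 30, 1, 6)
```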

nifti1_symmat

dipy.io.utils.nifti1_symmat(image_data, *args, **kwargs)

Returns a Nifti1Image with a symmetric matrix intent

load_polydata

dipy.io.vtk.load_polydata(file_name)

Load a vtk polydata from a supported format file

Supported file formats are OBJ, VTK, FIB, PLY, STL and XML

Parameters:

file_name : string

Returns:

output : vtkPolyData

optional_package

dipy.io.vtk.optional_package(name, trip_msg=None)

Return package-like thing and module setup for package name

Parameters:

name : str

package name

trip_msg : None or str

message to give when someone tries to use the returned package, but we could not import it, and have returned a TripWire object instead. Default message if None.

Returns:

pkg_like : module or TripWire instance

If we can import the package, return it. Otherwise return an object raising an error when accessed

have_pkg : bool

True if import for package was successful, false otherwise

module_setup : function

callable usually set as setup_module in calling namespace, to allow skipping tests.
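The pattern lets a module import an optional dependency without failing at import time: failures surface only when the package is actually used. A minimal stdlib sketch of the same idea (the names here are hypothetical, and this is not nibabel's implementation; the real function also returns the module_setup callable described above):

```python
import importlib

class TripWire:
    """Stand-in object that raises only when actually used (sketch)."""
    def __init__(self, msg):
        self._msg = msg
    def __getattr__(self, name):
        raise RuntimeError(self._msg)

def optional_package_sketch(name, trip_msg=None):
    """Return (package_or_tripwire, have_pkg) -- a simplified version
    of the (pkg, have_pkg, module_setup) triple described above."""
    try:
        return importlib.import_module(name), True
    except ImportError:
        msg = trip_msg or "package %s is not installed" % name
        return TripWire(msg), False

math_mod, have_math = optional_package_sketch('math')
missing, have_missing = optional_package_sketch('no_such_pkg_xyz')
```

Importing succeeds in both cases; only attribute access on `missing` raises, which is what allows tests depending on the optional package to be skipped.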

save_polydata

dipy.io.vtk.save_polydata(polydata, file_name, binary=False, color_array_name=None)

Save a vtk polydata to a supported format file

Save formats can be VTK, FIB, PLY, STL and XML.

Parameters:

polydata : vtkPolyData

file_name : string

set_input

dipy.io.vtk.set_input(vtk_object, inp)

Generic input function which takes into account VTK 5 or 6

Parameters:

vtk_object: vtk object

inp: vtkPolyData or vtkImageData or vtkAlgorithmOutput

Returns:

vtk_object

Notes

This can be used in the following way::

    from dipy.viz.utils import set_input
    poly_mapper = set_input(vtk.vtkPolyDataMapper(), poly_data)

setup_module

dipy.io.vtk.setup_module()