nipype.interfaces.io module

Set of interfaces that allow interaction with data. Currently available interfaces are:

DataSource: Generic nifti to named Nifti interface
DataSink: Generic named output from interfaces to data store
XNATSource: preliminary interface to XNAT

To come : XNATSink

BIDSDataGrabber

Link to code

Bases: LibraryBaseInterface, IOBase

BIDS datagrabber module that wraps around pybids to allow arbitrary querying of BIDS datasets.

Examples

By default, the BIDSDataGrabber fetches anatomical and functional images from a project, and makes BIDS entities (e.g. subject) available for filtering outputs.

>>> bg = BIDSDataGrabber() 
>>> bg.inputs.base_dir = 'ds005/' 
>>> bg.inputs.subject = '01' 
>>> results = bg.run() 

Dynamically created, user-defined output fields can also be defined to return different types of outputs from the same project. All outputs are filtered on common entities, which can be explicitly defined as infields.

>>> bg = BIDSDataGrabber(infields = ['subject']) 
>>> bg.inputs.base_dir = 'ds005/' 
>>> bg.inputs.subject = '01' 
>>> bg.inputs.output_query['dwi'] = dict(datatype='dwi') 
>>> results = bg.run() 
base_dir : a pathlike object or string representing an existing directory

Path to BIDS Directory.

index_derivatives : a boolean

Index derivatives/ sub-directory. (Nipype default value: False)

extra_derivatives : a list of items which are a pathlike object or string representing an existing directory

Additional derivative directories to index.

load_layout : a pathlike object or string representing an existing directory

Path to load an already saved BIDSLayout.

output_query : a dictionary with keys which are a string and with values which are a dictionary with keys which are any value and with values which are any value

Queries for outfield outputs.

raise_on_empty : a boolean

Generate exception if list is empty for a given field. (Nipype default value: True)

BIDSDataGrabber.input_spec

alias of nipype.interfaces.io.BIDSDataGrabberInputSpec

BIDSDataGrabber.output_spec

alias of nipype.interfaces.base.specs.DynamicTraitedSpec

class nipype.interfaces.io.BIDSDataGrabberInputSpec(**kwargs)

Bases: nipype.interfaces.base.specs.DynamicTraitedSpec

DataFinder

Link to code

Bases: IOBase

Search for paths that match a given regular expression. Allows a less proscriptive approach to gathering input files compared to DataGrabber. Will recursively search any subdirectories by default; this can be limited with the min/max depth options. Matched paths are available in the output 'out_paths'. Any named groups of captured text from the regular expression are also available as outputs of the same name.

Examples

>>> from nipype.interfaces.io import DataFinder
>>> df = DataFinder()
>>> df.inputs.root_paths = '.'
>>> df.inputs.match_regex = r'.+/(?P<series_dir>.+(qT1|ep2d_fid_T1).+)/(?P<basename>.+)\.nii.gz'
>>> result = df.run() 
>>> result.outputs.out_paths  
['./027-ep2d_fid_T1_Gd4/acquisition.nii.gz',
 './018-ep2d_fid_T1_Gd2/acquisition.nii.gz',
 './016-ep2d_fid_T1_Gd1/acquisition.nii.gz',
 './013-ep2d_fid_T1_pre/acquisition.nii.gz']
>>> result.outputs.series_dir  
['027-ep2d_fid_T1_Gd4',
 '018-ep2d_fid_T1_Gd2',
 '016-ep2d_fid_T1_Gd1',
 '013-ep2d_fid_T1_pre']
>>> result.outputs.basename  
['acquisition',
 'acquisition',
 'acquisition',
 'acquisition']

root_paths : a list of items which are any value or a string

ignore_regexes : a list of items which are any value

List of regular expressions; if any match the path, it will be ignored.

match_regex : a string

Regular expression for matching paths. (Nipype default value: (.+))

max_depth : an integer

The maximum depth to search beneath the root_paths.

min_depth : an integer

The minimum depth to search beneath the root_paths.

unpack_single : a boolean

Unpack single results from list. (Nipype default value: False)

DataFinder.output_spec

alias of nipype.interfaces.base.specs.DynamicTraitedSpec

DataGrabber

Link to code

Bases: IOBase

Find files on a filesystem.

Generic datagrabber module that wraps around glob in an intelligent way for neuroimaging tasks to grab files

Important

Doesn’t support directories currently

Examples

>>> from nipype.interfaces.io import DataGrabber

Pick all files from current directory

>>> dg = DataGrabber()
>>> dg.inputs.template = '*'

Pick file dicomdir/123456-1-1.dcm from current directory

>>> dg.inputs.template = '%s/%s.dcm'
>>> dg.inputs.template_args['outfiles']=[['dicomdir','123456-1-1.dcm']]

Same thing but with dynamically created fields

>>> dg = DataGrabber(infields=['arg1','arg2'])
>>> dg.inputs.template = '%s/%s.nii'
>>> dg.inputs.arg1 = 'foo'
>>> dg.inputs.arg2 = 'foo'

Note that this latter form can be used with iterables and iterfield in a pipeline.

Dynamically created, user-defined input and output fields

>>> dg = DataGrabber(infields=['sid'], outfields=['func','struct','ref'])
>>> dg.inputs.base_directory = '.'
>>> dg.inputs.template = '%s/%s.nii'
>>> dg.inputs.template_args['func'] = [['sid',['f3','f5']]]
>>> dg.inputs.template_args['struct'] = [['sid',['struct']]]
>>> dg.inputs.template_args['ref'] = [['sid','ref']]
>>> dg.inputs.sid = 's1'

Change the template only for output field struct. The rest use the general template

>>> dg.inputs.field_template = dict(struct='%s/struct.nii')
>>> dg.inputs.template_args['struct'] = [['sid']]
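The fan-out behavior of template_args can be illustrated with a small stand-alone function. This is a hypothetical sketch of the expansion rule, not DataGrabber's actual code; `expand` is an invented name:

```python
from itertools import product

def expand(template, arg_list, inputs):
    # Hypothetical sketch (invented helper, not DataGrabber's code):
    # string arguments are looked up among the interface inputs, and
    # list-valued arguments fan out into one filled template per element.
    out = []
    for args in arg_list:
        resolved = []
        for a in args:
            v = inputs.get(a, a) if isinstance(a, str) else a
            resolved.append(v if isinstance(v, list) else [v])
        out.extend(template % combo for combo in product(*resolved))
    return out

print(expand('%s/%s.nii', [['sid', ['f3', 'f5']]], {'sid': 's1'}))
# ['s1/f3.nii', 's1/f5.nii']
```

With this reading, the 'func' example above resolves 'sid' to 's1' and produces one path per element of ['f3', 'f5'].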
sort_filelist : a boolean

Sort the filelist that matches the template.

template : a string

Layout used to get files. Relative to base directory if defined.

base_directory : a pathlike object or string representing an existing directory

Path to the base directory consisting of subject data.

drop_blank_outputs : a boolean

Remove None entries from output lists. (Nipype default value: False)

raise_on_empty : a boolean

Generate exception if list is empty for a given field. (Nipype default value: True)

template_args : a dictionary with keys which are a string and with values which are a list of items which are a list of items which are any value

Information to plug into template.

DataGrabber.output_spec

alias of nipype.interfaces.base.specs.DynamicTraitedSpec

DataSink

Link to code

Bases: IOBase

Generic datasink module to store structured outputs.

Primarily for use within a workflow. This interface allows arbitrary creation of input attributes. The names of these attributes define the directory structure to create for storage of the files or directories.

The attributes take the following form:

string[[.[@]]string[[.[@]]string]] ...

where parts between [] are optional.

An attribute such as contrasts.@con will create a ‘contrasts’ directory to store the results linked to the attribute. If the @ is left out, such as in ‘contrasts.con’, a subdirectory ‘con’ will be created under ‘contrasts’.

The general form of the output is:

'base_directory/container/parameterization/destloc/filename'

destloc = string[[.[@]]string[[.[@]]string]] and filename come from the input to the connect statement.
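The mapping from an attribute string to a destination sub-path can be sketched as follows. `dest_path` is a hypothetical helper illustrating the rule above, not the interface's implementation:

```python
def dest_path(attr, filename):
    # Hypothetical helper (not the DataSink implementation) showing the
    # rule above: dotted parts nest directories, and a part prefixed
    # with '@' adds no subdirectory of its own.
    parts = [p for p in attr.split('.') if not p.startswith('@')]
    return '/'.join(parts + [filename])

print(dest_path('contrasts.@con', 'cont1.nii'))  # contrasts/cont1.nii
print(dest_path('contrasts.con', 'cont1.nii'))   # contrasts/con/cont1.nii
```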

Warning

This is not a thread-safe node because it can write to a common shared location. It will not complain when it overwrites a file.

Note

If both substitutions and regexp_substitutions are used, then substitutions are applied first followed by regexp_substitutions.
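The ordering described in this note can be sketched with plain string operations. `apply_substitutions` is a hypothetical helper for illustration, not DataSink's actual code:

```python
import re

def apply_substitutions(path, substitutions=(), regexp_substitutions=()):
    # Plain string substitutions are applied first...
    for old, new in substitutions:
        path = path.replace(old, new)
    # ...followed by regular-expression substitutions.
    for pattern, repl in regexp_substitutions:
        path = re.sub(pattern, repl, path)
    return path

print(apply_substitutions(
    '_subject_id_01/func.nii',
    substitutions=[('_subject_id_', 'sub-')],
    regexp_substitutions=[(r'func\.nii$', 'bold.nii')]))
# sub-01/bold.nii
```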

This interface cannot be used in a MapNode as the inputs are defined only when the connect statement is executed.

Examples

>>> ds = DataSink()
>>> ds.inputs.base_directory = 'results_dir'
>>> ds.inputs.container = 'subject'
>>> ds.inputs.structural = 'structural.nii'
>>> setattr(ds.inputs, 'contrasts.@con', ['cont1.nii', 'cont2.nii'])
>>> setattr(ds.inputs, 'contrasts.alt', ['cont1a.nii', 'cont2a.nii'])
>>> ds.run()  

To use DataSink in a MapNode, its inputs have to be defined at the time the interface is created.

>>> ds = DataSink(infields=['contrasts.@con'])
>>> ds.inputs.base_directory = 'results_dir'
>>> ds.inputs.container = 'subject'
>>> ds.inputs.structural = 'structural.nii'
>>> setattr(ds.inputs, 'contrasts.@con', ['cont1.nii', 'cont2.nii'])
>>> setattr(ds.inputs, 'contrasts.alt', ['cont1a.nii', 'cont2a.nii'])
>>> ds.run()  
_outputs : a dictionary with keys which are a string and with values which are any value

(Nipype default value: {})

base_directory : a string

Path to the base directory for storing data.

bucket : any value

Boto3 S3 bucket for manual override of bucket.

container : a string

Folder within base directory in which to store output.

creds_path : a string

Filepath to AWS credentials file for S3 bucket access; if not specified, the credentials will be taken from the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables.

encrypt_bucket_keys : a boolean

Flag indicating whether to use S3 server-side AES-256 encryption.

local_copy : a string

Copy files locally as well as to S3 bucket.

parameterization : a boolean

Store output in parametrized structure. (Nipype default value: True)

regexp_substitutions : a list of items which are a tuple of the form: (a string, a string)

List of 2-tuples reflecting a pair of a Python regexp pattern and a replacement string. Invoked after string substitutions.

remove_dest_dir : a boolean

Remove dest directory when copying dirs. (Nipype default value: False)

strip_dir : a string

Path to strip out of filename.

substitutions : a list of items which are a tuple of the form: (a string, a string)

List of 2-tuples reflecting string to substitute and string to replace it with.

out_file : any value

Datasink output.

ExportFile

Link to code

Bases: SimpleInterface

Export a file to an absolute path.

This interface copies an input file to a named output file. This is useful to save individual files to a specific location, instead of more flexible interfaces like DataSink.

Examples

>>> from nipype.interfaces.io import ExportFile
>>> import os
>>> import os.path as op
>>> ef = ExportFile()
>>> ef.inputs.in_file = "T1.nii.gz"
>>> os.mkdir("output_folder")
>>> ef.inputs.out_file = op.abspath("output_folder/sub1_out.nii.gz")
>>> res = ef.run()
>>> os.path.exists(res.outputs.out_file)
True
in_file : a pathlike object or string representing an existing file

Input file name.

out_file : a pathlike object or string representing a file

Output file name.

check_extension : a boolean

Ensure that the input and output file extensions match. (Nipype default value: True)

clobber : a boolean

Permit overwriting existing files.

out_file : a pathlike object or string representing an existing file

Output file name.

FreeSurferSource

Link to code

Bases: IOBase

Generates FreeSurfer subject info from their directories.

Examples

>>> from nipype.interfaces.io import FreeSurferSource
>>> fs = FreeSurferSource()
>>> #fs.inputs.subjects_dir = '.'
>>> fs.inputs.subject_id = 'PWS04'
>>> res = fs.run() 
>>> fs.inputs.hemi = 'lh'
>>> res = fs.run() 
subject_id : a string

Subject name for whom to retrieve data.

subjects_dir : a pathlike object or string representing an existing directory

Freesurfer subjects directory.

hemi : 'both' or 'lh' or 'rh'

Selects hemisphere specific outputs. (Nipype default value: both)

BA_stats : a list of items which are a pathlike object or string representing an existing file

Brodmann Area statistics files.

T1 : a pathlike object or string representing an existing file

Intensity normalized whole-head volume.

annot : a list of items which are a pathlike object or string representing an existing file

Surface annotation files.

aparc_a2009s_stats : a list of items which are a pathlike object or string representing an existing file

Aparc a2009s parcellation statistics files.

aparc_aseg : a list of items which are a pathlike object or string representing an existing file

Aparc parcellation projected into aseg volume.

aparc_stats : a list of items which are a pathlike object or string representing an existing file

Aparc parcellation statistics files.

area_pial : a list of items which are a pathlike object or string representing an existing file

Mean area of triangles each vertex on the pial surface is associated with.

aseg : a pathlike object or string representing an existing file

Volumetric map of regions from automatic segmentation.

aseg_stats : a list of items which are a pathlike object or string representing an existing file

Automated segmentation statistics file.

avg_curv : a list of items which are a pathlike object or string representing an existing file

Average atlas curvature, sampled to subject.

brain : a pathlike object or string representing an existing file

Intensity normalized brain-only volume.

brainmask : a pathlike object or string representing an existing file

Skull-stripped (brain-only) volume.

curv : a list of items which are a pathlike object or string representing an existing file

Maps of surface curvature.

curv_pial : a list of items which are a pathlike object or string representing an existing file

Curvature of pial surface.

curv_stats : a list of items which are a pathlike object or string representing an existing file

Curvature statistics files.

entorhinal_exvivo_stats : a list of items which are a pathlike object or string representing an existing file

Entorhinal exvivo statistics files.

filled : a pathlike object or string representing an existing file

Subcortical mass volume.

graymid : a list of items which are a pathlike object or string representing an existing file

Graymid/midthickness surface meshes.

inflated : a list of items which are a pathlike object or string representing an existing file

Inflated surface meshes.

jacobian_white : a list of items which are a pathlike object or string representing an existing file

Distortion required to register to spherical atlas.

label : a list of items which are a pathlike object or string representing an existing file

Volume and surface label files.

norm : a pathlike object or string representing an existing file

Normalized skull-stripped volume.

nu : a pathlike object or string representing an existing file

Non-uniformity corrected whole-head volume.

orig : a pathlike object or string representing an existing file

Base image conformed to Freesurfer space.

pial : a list of items which are a pathlike object or string representing an existing file

Gray matter/pia mater surface meshes.

rawavg : a pathlike object or string representing an existing file

Volume formed by averaging input images.

ribbon : a list of items which are a pathlike object or string representing an existing file

Volumetric maps of cortical ribbons.

smoothwm : a list of items which are a pathlike object or string representing an existing file

Smoothed original surface meshes.

sphere : a list of items which are a pathlike object or string representing an existing file

Spherical surface meshes.

sphere_reg : a list of items which are a pathlike object or string representing an existing file

Spherical registration file.

sulc : a list of items which are a pathlike object or string representing an existing file

Surface maps of sulcal depth.

thickness : a list of items which are a pathlike object or string representing an existing file

Surface maps of cortical thickness.

volume : a list of items which are a pathlike object or string representing an existing file

Surface maps of cortical volume.

white : a list of items which are a pathlike object or string representing an existing file

White/gray matter surface meshes.

wm : a pathlike object or string representing an existing file

Segmented white-matter volume.

wmparc : a pathlike object or string representing an existing file

Aparc parcellation projected into subcortical white matter.

wmparc_stats : a list of items which are a pathlike object or string representing an existing file

White matter parcellation statistics file.

IOBase

Link to code

JSONFileGrabber

Link to code

Bases: IOBase

Datagrabber interface that loads a JSON file and generates an output for every first-level object.

Example

>>> import pprint
>>> from nipype.interfaces.io import JSONFileGrabber
>>> jsonSource = JSONFileGrabber()
>>> jsonSource.inputs.defaults = {'param1': 'overrideMe', 'param3': 1.0}
>>> res = jsonSource.run()
>>> pprint.pprint(res.outputs.get())
{'param1': 'overrideMe', 'param3': 1.0}
>>> jsonSource.inputs.in_file = os.path.join(datadir, 'jsongrabber.txt')
>>> res = jsonSource.run()
>>> pprint.pprint(res.outputs.get())  
{'param1': 'exampleStr', 'param2': 4, 'param3': 1.0}
defaults : a dictionary with keys which are any value and with values which are any value

JSON dictionary that sets default output values, overridden by values found in in_file.

in_file : a pathlike object or string representing an existing file

JSON source file.

JSONFileGrabber.output_spec

alias of nipype.interfaces.base.specs.DynamicTraitedSpec

JSONFileSink

Link to code

Bases: IOBase

Very simple frontend for storing values into a JSON file. Entries already existing in in_dict will be overridden by matching entries dynamically added as inputs.
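This override order amounts to a plain dictionary merge, sketched below. `merged_json` is an invented name used only to illustrate the precedence; it is not the interface's actual code:

```python
def merged_json(in_dict, dynamic_inputs):
    # merged_json is an invented name; this only illustrates precedence.
    # Dynamically added inputs override matching keys in in_dict.
    out = dict(in_dict)
    out.update(dynamic_inputs)
    return out

print(merged_json({'subject_id': 'overrideMe', 'site': 'A'},
                  {'subject_id': 's1', 'some_measurement': 11.4}))
# {'subject_id': 's1', 'site': 'A', 'some_measurement': 11.4}
```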

Warning

This is not a thread-safe node because it can write to a common shared location. It will not complain when it overwrites a file.

Examples

>>> jsonsink = JSONFileSink(input_names=['subject_id',
...                         'some_measurement'])
>>> jsonsink.inputs.subject_id = 's1'
>>> jsonsink.inputs.some_measurement = 11.4
>>> jsonsink.run() 

Using a dictionary as input:

>>> dictsink = JSONFileSink()
>>> dictsink.inputs.in_dict = {'subject_id': 's1',
...                            'some_measurement': 11.4}
>>> dictsink.run() 
_outputs : a dictionary with keys which are any value and with values which are any value

(Nipype default value: {})

in_dict : a dictionary with keys which are any value and with values which are any value

Input JSON dictionary. (Nipype default value: {})

out_file : a pathlike object or string representing a file

JSON sink file.

out_file : a pathlike object or string representing a file

JSON sink file.

MySQLSink

Link to code

Bases: IOBase

Very simple frontend for storing values into MySQL database.

Examples

>>> sql = MySQLSink(input_names=['subject_id', 'some_measurement'])
>>> sql.inputs.database_name = 'my_database'
>>> sql.inputs.table_name = 'experiment_results'
>>> sql.inputs.username = 'root'
>>> sql.inputs.password = 'secret'
>>> sql.inputs.subject_id = 's1'
>>> sql.inputs.some_measurement = 11.4
>>> sql.run() 
config : a pathlike object or string representing a file

MySQL Options File (same format as my.cnf). Mutually exclusive with inputs: host.

database_name : a string

Otherwise known as the schema name.

host : a string

Mutually exclusive with inputs: config. Requires inputs: username, password. (Nipype default value: localhost)

table_name : a string

username : a string

password : a string

class nipype.interfaces.io.ProgressPercentage(filename)

Bases: object

Callable class instance (via __call__ method) that displays the upload percentage of a file to S3

S3DataGrabber

Link to code

Bases: LibraryBaseInterface, IOBase

Pull data from an Amazon S3 Bucket.

Generic datagrabber module that wraps around glob in an intelligent way for neuroimaging tasks to grab files from Amazon S3

Works exactly like DataGrabber, except, you must specify an S3 “bucket” and “bucket_path” to search for your data and a “local_directory” to store the data. “local_directory” should be a location on HDFS for Spark jobs. Additionally, “template” uses regex style formatting, rather than the glob-style found in the original DataGrabber.

Examples

>>> s3grab = S3DataGrabber(infields=['subj_id'], outfields=["func", "anat"])
>>> s3grab.inputs.bucket = 'openneuro'
>>> s3grab.inputs.sort_filelist = True
>>> s3grab.inputs.template = '*'
>>> s3grab.inputs.anon = True
>>> s3grab.inputs.bucket_path = 'ds000101/ds000101_R2.0.0/uncompressed/'
>>> s3grab.inputs.local_directory = '/tmp'
>>> s3grab.inputs.field_template = {'anat': '%s/anat/%s_T1w.nii.gz',
...                                 'func': '%s/func/%s_task-simon_run-1_bold.nii.gz'}
>>> s3grab.inputs.template_args = {'anat': [['subj_id', 'subj_id']],
...                                'func': [['subj_id', 'subj_id']]}
>>> s3grab.inputs.subj_id = 'sub-01'
>>> s3grab.run()  
bucket : a string

Amazon S3 bucket where your data is stored.

sort_filelist : a boolean

Sort the filelist that matches the template.

template : a string

Layout used to get files. Relative to bucket_path if defined. Uses regex rather than glob style formatting.

anon : a boolean

Use anonymous connection to s3. If this is set to True, boto may print a urlopen error, but this does not prevent data from being downloaded. (Nipype default value: False)

bucket_path : a string

Location within your bucket for subject data. (Nipype default value: "")

local_directory : a pathlike object or string representing an existing directory

Path to the local directory for subject data to be downloaded and accessed. Should be on HDFS for Spark jobs.

raise_on_empty : a boolean

Generate exception if list is empty for a given field. (Nipype default value: True)

region : a string

Region of s3 bucket. (Nipype default value: us-east-1)

template_args : a dictionary with keys which are a string and with values which are a list of items which are a list of items which are any value

Information to plug into template.

S3DataGrabber.output_spec

alias of nipype.interfaces.base.specs.DynamicTraitedSpec

S3DataGrabber.s3tolocal(s3path, bkt)

SQLiteSink

Link to code

Bases: LibraryBaseInterface, IOBase

Very simple frontend for storing values into SQLite database.

Warning

This is not a thread-safe node because it can write to a common shared location. It will not complain when it overwrites a file.

Examples

>>> sql = SQLiteSink(input_names=['subject_id', 'some_measurement'])
>>> sql.inputs.database_file = 'my_database.db'
>>> sql.inputs.table_name = 'experiment_results'
>>> sql.inputs.subject_id = 's1'
>>> sql.inputs.some_measurement = 11.4
>>> sql.run() 

database_file : a pathlike object or string representing an existing file

table_name : a string
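Assuming the sink issues one INSERT per run with input_names as the column names (an assumption about the implementation, not confirmed by this page), the example above amounts to roughly the following stdlib sqlite3 calls:

```python
import sqlite3

# Rough stdlib equivalent of the SQLiteSink example above (assumption:
# one parameterized INSERT per run, input_names as the column names).
conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE experiment_results (subject_id TEXT, some_measurement REAL)')
conn.execute('INSERT INTO experiment_results (subject_id, some_measurement) VALUES (?, ?)',
             ('s1', 11.4))
row = conn.execute('SELECT * FROM experiment_results').fetchone()
print(row)  # ('s1', 11.4)
conn.close()
```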

SSHDataGrabber

Link to code

Bases: LibraryBaseInterface, DataGrabber

Extension of the DataGrabber module that downloads the file list, and optionally the files, from an SSH server. The SSH operations must not require a password, so an SSH agent must be active where this module is run.

Attention

Doesn’t support directories currently

Examples

>>> from nipype.interfaces.io import SSHDataGrabber
>>> dg = SSHDataGrabber()
>>> dg.inputs.hostname = 'test.rebex.net'
>>> dg.inputs.user = 'demo'
>>> dg.inputs.password = 'password'
>>> dg.inputs.base_directory = 'pub/example'

Pick all files from the base directory

>>> dg.inputs.template = '*'

Pick all files starting with "pop" followed by a digit from current directory

>>> dg.inputs.template_expression = 'regexp'
>>> dg.inputs.template = 'pop[0-9].*'

Same thing but with dynamically created fields

>>> dg = SSHDataGrabber(infields=['arg1','arg2'])
>>> dg.inputs.hostname = 'test.rebex.net'
>>> dg.inputs.user = 'demo'
>>> dg.inputs.password = 'password'
>>> dg.inputs.base_directory = 'pub'
>>> dg.inputs.template = '%s/%s.txt'
>>> dg.inputs.arg1 = 'example'
>>> dg.inputs.arg2 = 'foo'

Note that this latter form can be used with iterables and iterfield in a pipeline.

Dynamically created, user-defined input and output fields

>>> dg = SSHDataGrabber(infields=['sid'], outfields=['func','struct','ref'])
>>> dg.inputs.hostname = 'myhost.com'
>>> dg.inputs.base_directory = '/main_folder/my_remote_dir'
>>> dg.inputs.template_args['func'] = [['sid',['f3','f5']]]
>>> dg.inputs.template_args['struct'] = [['sid',['struct']]]
>>> dg.inputs.template_args['ref'] = [['sid','ref']]
>>> dg.inputs.sid = 's1'

Change the template only for output field struct. The rest use the general template

>>> dg.inputs.field_template = dict(struct='%s/struct.nii')
>>> dg.inputs.template_args['struct'] = [['sid']]
base_directory : a string

Path to the base directory consisting of subject data.

hostname : a string

Server hostname.

sort_filelist : a boolean

Sort the filelist that matches the template.

template : a string

Layout used to get files. Relative to base directory if defined.

download_files : a boolean

If false it will return the file names without downloading them. (Nipype default value: True)

drop_blank_outputs : a boolean

Remove None entries from output lists. (Nipype default value: False)

password : a string

Server password.

raise_on_empty : a boolean

Generate exception if list is empty for a given field. (Nipype default value: True)

ssh_log_to_file : a string

If set SSH commands will be logged to the given file. (Nipype default value: "")

template_args : a dictionary with keys which are a string and with values which are a list of items which are a list of items which are any value

Information to plug into template.

template_expression : 'fnmatch' or 'regexp'

Use either fnmatch or regexp to express templates. (Nipype default value: fnmatch)

username : a string

Server username.

SSHDataGrabber.output_spec

alias of nipype.interfaces.base.specs.DynamicTraitedSpec

SelectFiles

Link to code

Bases: IOBase

Flexibly collect data from disk to feed into workflows.

This interface uses Python’s {}-based string formatting syntax to plug values (possibly known only at workflow execution time) into string templates and collect files from persistent storage. These templates can also be combined with glob wildcards (*, ?) and character ranges ([...]). The field names in the formatting template (i.e. the terms in braces) will become input fields on the interface, and the keys in the templates dictionary will form the output fields.

Examples

>>> import pprint
>>> from nipype import SelectFiles, Node
>>> templates={"T1": "{subject_id}/struct/T1.nii",
...            "epi": "{subject_id}/func/f[0,1].nii"}
>>> dg = Node(SelectFiles(templates), "selectfiles")
>>> dg.inputs.subject_id = "subj1"
>>> pprint.pprint(dg.outputs.get())  
{'T1': <undefined>, 'epi': <undefined>}

Note that SelectFiles does not support lists as inputs for the dynamic fields. Attempts to do so may lead to unexpected results because brackets also express glob character ranges. For example,

>>> templates["epi"] = "{subject_id}/func/f{run}.nii"
>>> dg = Node(SelectFiles(templates), "selectfiles")
>>> dg.inputs.subject_id = "subj1"
>>> dg.inputs.run = [10, 11]

would match f0.nii or f1.nii, not f10.nii or f11.nii.
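This caveat is easy to demonstrate with plain string formatting and the standard fnmatch module:

```python
import fnmatch

# Formatting a list into the template yields a glob character class,
# not a set of alternatives:
pattern = '{subject_id}/func/f{run}.nii'.format(subject_id='subj1', run=[10, 11])
print(pattern)  # subj1/func/f[10, 11].nii

# The bracketed part matches any single character from '1', '0', ',', ' ':
print(fnmatch.fnmatch('subj1/func/f0.nii', pattern))   # True
print(fnmatch.fnmatch('subj1/func/f10.nii', pattern))  # False
```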

base_directory : a pathlike object or string representing an existing directory

Root path common to templates.

force_lists : a boolean or a list of items which are a string

Whether to return outputs as a list even when only one file matches the template. Either a boolean that applies to all output fields or a list of output field names to coerce to a list. (Nipype default value: False)

raise_on_empty : a boolean

Raise an exception if a template pattern matches no files. (Nipype default value: True)

sort_filelist : a boolean

When matching multiple files, return them in sorted order. (Nipype default value: True)

SelectFiles.output_spec

alias of nipype.interfaces.base.specs.DynamicTraitedSpec

XNATSink

Link to code

Bases: LibraryBaseInterface, IOBase

Generic datasink module that takes a directory containing a list of nifti files and provides a set of structured output fields.

config : a pathlike object or string representing a file

Mutually exclusive with inputs: server.

experiment_id : a string

Set to workflow name.

project_id : a string

Project in which to store the outputs.

server : a string

Mutually exclusive with inputs: config. Requires inputs: user, pwd.

subject_id : a string

Set to subject id.

_outputs : a dictionary with keys which are a string and with values which are any value

(Nipype default value: {})

assessor_id : a string

Option to customize outputs representation in XNAT - assessor level will be used with specified id. Mutually exclusive with inputs: reconstruction_id.

cache_dir : a pathlike object or string representing a directory

pwd : a string

reconstruction_id : a string

Option to customize outputs representation in XNAT - reconstruction level will be used with specified id. Mutually exclusive with inputs: assessor_id.

share : a boolean

Option to share the subjects from the original project instead of creating new ones when possible - the created experiments are then shared back to the original project. (Nipype default value: False)

user : a string

XNATSource

Link to code

Bases: LibraryBaseInterface, IOBase

Pull data from an XNAT server.

Generic XNATSource module that wraps around the pyxnat module in an intelligent way for neuroimaging tasks to grab files and data from an XNAT server.

Examples

Pick all files from current directory

>>> dg = XNATSource()
>>> dg.inputs.template = '*'
>>> dg = XNATSource(infields=['project','subject','experiment','assessor','inout'])
>>> dg.inputs.query_template = ('/projects/%s/subjects/%s/experiments/%s'
...                             '/assessors/%s/%s_resources/files')
>>> dg.inputs.project = 'IMAGEN'
>>> dg.inputs.subject = 'IMAGEN_000000001274'
>>> dg.inputs.experiment = '*SessionA*'
>>> dg.inputs.assessor = '*ADNI_MPRAGE_nii'
>>> dg.inputs.inout = 'out'
>>> dg = XNATSource(infields=['sid'],outfields=['struct','func'])
>>> dg.inputs.query_template = ('/projects/IMAGEN/subjects/%s/experiments/*SessionA*'
...                             '/assessors/*%s_nii/out_resources/files')
>>> dg.inputs.query_template_args['struct'] = [['sid','ADNI_MPRAGE']]
>>> dg.inputs.query_template_args['func'] = [['sid','EPI_faces']]
>>> dg.inputs.sid = 'IMAGEN_000000001274'
config : a pathlike object or string representing a file

Mutually exclusive with inputs: server.

query_template : a string

Layout used to get files. Relative to base directory if defined.

server : a string

Mutually exclusive with inputs: config. Requires inputs: user, pwd.

cache_dir : a pathlike object or string representing a directory

Cache directory.

pwd : a string

query_template_args : a dictionary with keys which are a string and with values which are a list of items which are a list of items which are any value

Information to plug into template. (Nipype default value: {'outfiles': []})

user : a string

XNATSource.output_spec

alias of nipype.interfaces.base.specs.DynamicTraitedSpec

nipype.interfaces.io.add_traits(base, names, trait_type=None)

Add traits to a traited class.

All traits are set to Undefined by default

nipype.interfaces.io.capture_provenance()

nipype.interfaces.io.copytree(src, dst, use_hardlink=False)

Recursively copy a directory tree using nipype.utils.filemanip.copyfile()

This is not a thread-safe routine. However, in the case of creating new directories, it checks to see if a particular directory has already been created by another process.

nipype.interfaces.io.push_file(self, xnat, file_name, out_key, uri_template_args)

nipype.interfaces.io.push_provenance()

nipype.interfaces.io.quote_id(string)

nipype.interfaces.io.unquote_id(string)