The io module

This module is the gateway for all input/output operations in NeuroTools, especially regarding the interface with pyNN. It is in this module that you will find the standard formats currently supported by NeuroTools (text and pickle; hdf5 is planned for the near future). If you want to implement your own load function, reading your own particular data structure for the signals module, you should read the documentation below.

File Handlers

A file handler is an abstract object that has to implement a few key methods in order to read and write NeuroTools objects from a file (given in the constructor). The idea is that if you want to design your own file handler, you just have to implement the abstract methods of the object, i.e. write() (to write an object to a file), read_spikes(params) (to read data and return a SpikeList object) and read_analogs(type, params) (to read data and return an analog signal list according to type). To get a better understanding, just have a look at the two file handlers implemented in NeuroTools, i.e. StandardTextFile and StandardPickleFile.

The StandardTextFile class

Creation

The StandardTextFile class inherits from FileHandler.

Here is an example of creating simple StandardTextFile objects:

>>> textfile = StandardTextFile("test.txt")
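
The class lives in the NeuroTools.io module, so the example above assumes it has been imported first:

>>> from NeuroTools.io import StandardTextFile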

Usage

If you want to read a data file with spikes, and return a SpikeList object:

>>> spklist = textfile.read_spikes({'id_list' :range(11), 't_start' : 0, 't_stop' : 1000})

More generally, the read_spikes() method of an object inheriting from FileHandler accepts arguments such as id_list, t_start and t_stop, which are the ones used in the SpikeList constructor. Note that the StandardTextFile object has private functions, for internal use only, that check/read information in the headers of the text file. See io.py for a deeper understanding of its behavior.

Similar syntax is used for reading an analog signal object:

>>> aslist = textfile.read_analogs('vm', {'id_list':range(11)})

In the case of an analog signal, the type argument, selected from [vm, conductance, current], specifies the type of the NeuroTools object returned by the function: either a VmList, a ConductanceList or a CurrentList.
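
For example, reading conductances instead of membrane potentials returns a ConductanceList:

>>> condlist = textfile.read_analogs('conductance', {'id_list' : range(11)})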

If you want to save an object to a file, just do:

>>> textfile.write(object)

object can be a SpikeList or any kind of AnalogSignalList.
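
For instance, writing back the SpikeList read earlier round-trips the data through the text format:

>>> textfile.write(spklist)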

The StandardPickleFile class

Creation

The StandardPickleFile class also inherits from FileHandler.

Here is an example of creating simple StandardPickleFile objects:

>>> pickfile = StandardPickleFile("test.pick")

Usage

If you want to read a data file with spikes, and return a SpikeList object:

>>> spklist = pickfile.read_spikes({'id_list' : range(11), 't_start' : 0, 't_stop' : 1000})

Since this object inherits from FileHandler, its behavior is exactly the same as that of the StandardTextFile. Similar syntax is used for reading an analog signal object:

>>> aslist = pickfile.read_analogs('vm', {'id_list' : range(11)})

In the case of an analog signal, the type argument, selected from [vm, conductance, current], specifies the type of the NeuroTools object returned by the function: either a VmList, a ConductanceList or a CurrentList.

If you want to save an object to a file, just do:

>>> pickfile.write(object)

object can be a SpikeList or any kind of AnalogSignalList.
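
Because both handlers implement the same FileHandler interface, converting data between the two formats is just a matter of reading with one and writing with the other. A minimal sketch, assuming test.txt already contains spike data:

>>> spklist = textfile.read_spikes({'id_list' : range(11), 't_start' : 0, 't_stop' : 1000})
>>> pickfile.write(spklist)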

The YOURStandardFormatFile class

As said before, you just have to implement a few key methods, as defined in the FileHandler class:

class YOURStandardFormatFile(FileHandler):

    def write(self, object):
        ### Your method here ###
        ### Should save an object to the file self.filename ###
        pass

    def read_spikes(self, params):
        ### Your method here, reading data from self.filename ###
        ### Should read data and return a SpikeList object constrained by params ###
        from NeuroTools import signals
        return signals.SpikeList(...)

    def read_analogs(self, type, params):
        if type not in ["vm", "current", "conductance"]:
            raise Exception("The type %s is not available for analog signals" % type)
        ### Your method here, reading data from self.filename ###
        from NeuroTools import signals
        if type == 'vm':
            return signals.VmList(...)
        elif type == 'conductance':
            return signals.ConductanceList(...)
        elif type == 'current':
            return signals.CurrentList(...)
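
Once implemented, such a handler is used exactly like the built-in ones (the file name below is purely illustrative):

>>> myfile = YOURStandardFormatFile("my_data.dat")
>>> spikes = myfile.read_spikes({'id_list' : range(11), 't_start' : 0, 't_stop' : 1000})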

Data Handlers

The data handler is just a file input/output manager: an interface for the load/save functions. This is the kind of object created by all the load methods of NeuroTools.signals.

The DataHandler class

You should not have to deal directly with this class, because it is just an interface. See io.py for more details.
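
In practice, the loaders combine a file handler with a DataHandler behind the scenes; the generic load() function shown in the autodoc below takes a file handler and a one-letter data type ('s' for spikes), assuming it is imported from NeuroTools.signals:

>>> from NeuroTools.io import StandardTextFile
>>> from NeuroTools.signals import load
>>> data = StandardTextFile("my_data.dat")
>>> spikes = load(data, 's')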

Autodoc

NeuroTools.io

A collection of functions to handle all the inputs/outputs of the NeuroTools.signals module, used by the loaders.

Classes

FileHandler - abstract class which should be overridden, managing how a file will load/write its data
StandardTextFile - object used to manipulate text representations of NeuroTools objects (spikes or analog signals)
StandardPickleFile - object used to manipulate pickle representations of NeuroTools objects (spikes or analog signals)
NestFile - object used to manipulate raw NEST files that have not been saved by pyNN (without headers)

DataHandler - object to establish the interface between NeuroTools.signals and NeuroTools.io

All those objects can be used with NeuroTools.signals:

>>> data = StandardTextFile("my_data.dat")
>>> spikes = load(data, 's')
class NeuroTools.io.DataHandler(user_file, object=None)[source]

Class to establish the interface for loading/saving objects in NeuroTools

Inputs:
filename - the user file for reading/writing data. By default, if this is a string, a StandardTextFile is created

object - the object to be saved. Could be a SpikeList or an AnalogSignalList

Examples:

>>> txtfile = StandardTextFile("results.dat")
>>> DataHandler(txtfile)
>>> picklefile = StandardPickleFile("results.dat")
>>> DataHandler(picklefile)
load_analogs(type, **params)[source]

Read an AnalogSignalList object from a file and return the AnalogSignalList object of type type, created from the File and from the additional params that may have been provided

type can be in ["vm", "current", "conductance"]

Examples:

>>> params = {'id_list' : range(100), 't_stop' : 1000}
>>> handler.load_analogs("vm", params)
VmList object (with params taken into account)
>>> handler.load_analogs("current", params)
CurrentList object (with params taken into account)
See also
AnalogSignalList, load_spikes
load_spikes(**params)[source]

Function to load a SpikeList object from a file. The data type is automatically inferred. Returns a SpikeList object.

Inputs:
params - a dictionary with all the parameters used by the SpikeList constructor
Examples:

>>> params = {'id_list' : range(100), 't_stop' : 1000}
>>> handler.load_spikes(params)
SpikeList object
See also
SpikeList, load_analogs
save()[source]

Save the object defined in self.object with the method of self.user_file

Note that you can add your own format for I/O of such NeuroTools objects

class NeuroTools.io.FileHandler(filename)[source]

Class to handle all the file read/write methods for the key objects of the signals class, i.e. SpikeList and AnalogSignalList. Could be extended.

This is an abstract class that will be implemented for each format (txt, pickle, hdf5). The key methods of the class are:

write(object) - write an object to a file
read_spikes(params) - read a SpikeList file with some params
read_analogs(type, params) - read an AnalogSignalList of type type with some params
Inputs:
filename - the file name for reading/writing data

If you want to implement your own file format, you just have to create an object that will inherit from this FileHandler class and implement the previous functions. See io.py for more details

read_analogs(type, params)[source]

Read an AnalogSignalList object from a file and return the AnalogSignalList object of type type, created from the File and from the additional params that may have been provided

type can be in ["vm", "current", "conductance"]

Examples:

>>> params = {'id_list' : range(100), 't_stop' : 1000}
>>> handler.read_analogs("vm", params)
VmList object (with params taken into account)
>>> handler.read_analogs("current", params)
CurrentList object (with params taken into account)
read_spikes(params)[source]

Read a SpikeList object from a file and return the SpikeList object, created from the File and from the additional params that may have been provided

Examples:

>>> params = {'id_list' : range(100), 't_stop' : 1000}
>>> handler.read_spikes(params)
SpikeList object (with params taken into account)
write(object)[source]

Write the object to the file.

Examples:

>>> handler.write(SpikeListObject)
>>> handler.write(VmListObject)
class NeuroTools.io.NestFile(filename, padding=0, with_time=False, with_gid=True)[source]
get_data(sepchar='\t', skipchar='#')[source]

Load data from a text file and return a list of the data

read_analogs(type, params)[source]
read_spikes(params)[source]

Read a SpikeList object from a file and return the SpikeList object, created from the File and from the additional params that may have been provided

Examples:

>>> params = {'id_list' : range(100), 't_stop' : 1000}
>>> handler.read_spikes(params)
SpikeList object (with params taken into account)
write(object)[source]

Write the object to the file.

Examples:

>>> handler.write(SpikeListObject)
>>> handler.write(VmListObject)
class NeuroTools.io.PyNNNumpyBinaryFile(filename)[source]
read_spikes(params)[source]
class NeuroTools.io.StandardPickleFile(filename)[source]
read_analogs(type, params)[source]
read_spikes(params)[source]
write(object)[source]
class NeuroTools.io.StandardTextFile(filename)[source]
get_data(sepchar='\t', skipchar='#')[source]

Load data from a text file and return an array of the data

read_analogs(type, params)[source]
read_spikes(params)[source]
write(object)[source]