Spike Sorting¶
Overview¶
Developer Note: if you may make a PR in the future, be sure to copy this notebook and prefix the copy's name with temp, which is gitignored, to avoid future conflicts.
This is one notebook in a multi-part series on Spyglass.
- To set up your Spyglass environment and database, see the Setup notebook
- For additional info on DataJoint syntax, including table definitions and inserts, see the Insert Data notebook
Extract the recording
- Specifying your NWB file.
- Specifying which electrodes in the recording to sort data from: SortGroup
- Specifying the time segment of the recording we want to sort: IntervalList, SortInterval
- Specifying the parameters to use for filtering the recording: SpikeSortingPreprocessingParameters
- Combining these parameters: SpikeSortingRecordingSelection
- Extracting the recording: SpikeSortingRecording
- Specifying the parameters to apply for artifact detection/removal: ArtifactDetectionParameters
Spike sorting the recording
- Specifying the spike sorter and parameters to use: SpikeSorterParameters
- Combining these parameters: SpikeSortingSelection
- Spike sorting the extracted recording according to the chosen parameter set: SpikeSorting
Imports¶
Let's start by importing tables from Spyglass and quieting warnings caused by some dependencies.
Note: If the imports below throw a FileNotFoundError, make a cell with !env | grep X, where X is part of the problematic directory. This will show the variable causing issues. Make another cell that sets this variable with %env VAR="/your/path/".
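For example, if the missing directory were Spyglass's base data directory (a hypothetical case; the variable name below is an assumption to verify against your own error), the two cells might look like:
!env | grep SPYGLASS
%env SPYGLASS_BASE_DIR="/your/path/"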
import os
import datajoint as dj
import numpy as np
# change to the upper level folder to detect dj_local_conf.json
if os.path.basename(os.getcwd()) == "notebooks":
os.chdir("..")
dj.config.load("dj_local_conf.json") # load config for database connection info
import spyglass.common as sgc
import spyglass.spikesorting as sgs
# ignore datajoint+jupyter async warnings
import warnings
warnings.simplefilter("ignore", category=DeprecationWarning)
warnings.simplefilter("ignore", category=ResourceWarning)
Fetch Exercise¶
If you haven't already done so, add yourself to LabTeam
name, email, dj_user = "Firstname Lastname", "example@gmail.com", "user"
sgc.LabMember.insert_from_name(name)
sgc.LabMember.LabMemberInfo.insert1(
[name, email, dj_user], skip_duplicates=True
)
sgc.LabTeam.LabTeamMember.insert1(
{"team_name": "My Team", "lab_member_name": name},
skip_duplicates=True,
)
We can try fetch to confirm.
Exercise: Try to write a few lines to generate a dictionary with team names as keys and lists of members as values. It may be helpful to add more data with the code above and use fetch(as_dict=True).
my_team_members = (
(sgc.LabTeam.LabTeamMember & {"team_name": "My Team"})
.fetch("lab_member_name")
.tolist()
)
if name in my_team_members:
print("You made it in!")
You made it in!
Code hidden here
members = sgc.LabTeam.LabTeamMember.fetch(as_dict=True)
teams_dict = {member["team_name"]: [] for member in members}
for member in members:
teams_dict[member["team_name"]].append(member["lab_member_name"])
print(teams_dict)
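An equivalent solution uses collections.defaultdict, which avoids pre-building the keys. A minimal sketch using the same fetch:
from collections import defaultdict

teams_dict = defaultdict(list)
for member in sgc.LabTeam.LabTeamMember.fetch(as_dict=True):
    teams_dict[member["team_name"]].append(member["lab_member_name"])
print(dict(teams_dict))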
Adding an NWB file¶
Import Data¶
If you haven't already, load an NWB file. For more details on downloading and importing data, see this notebook.
import spyglass.data_import as sdi
sdi.insert_sessions("minirec20230622.nwb")
nwb_file_name = "minirec20230622_.nwb"
/home/cb/wrk/spyglass/src/spyglass/data_import/insert_sessions.py:41: UserWarning: Cannot insert data from minirec20230622.nwb: minirec20230622_.nwbis already in Nwbfile table. warnings.warn(
Extracting the recording¶
SortGroup¶
Each NWB file will have multiple electrodes we can use for spike sorting. We commonly group electrodes into a SortGroup by the tetrode or probe shank they were on.
Note: This will delete any existing entries. Answer 'yes' when prompted.
sgs.SortGroup().set_group_by_shank(nwb_file_name)
[2023-07-21 13:56:24,232][INFO]: Deleting 128 rows from `spikesorting_recording`.`sort_group__sort_group_electrode` [2023-07-21 13:56:24,234][INFO]: Deleting 4 rows from `spikesorting_recording`.`sort_group`
[2023-07-21 13:56:27,358][INFO]: Deletes committed.
Each electrode has an electrode_id and is associated with an electrode_group_name, which corresponds with a sort_group_id.
For example, data recorded from a 32-tetrode (128-channel) drive results in 128 unique electrode_id values. This could result in 32 unique electrode_group_name values and 32 unique sort_group_id values.
sgs.SortGroup.SortGroupElectrode & {"nwb_file_name": nwb_file_name}
nwb_file_name name of the NWB file | sort_group_id identifier for a group of electrodes | electrode_group_name electrode group name from NWBFile | electrode_id the unique number for this electrode |
---|---|---|---|
minirec20230622_.nwb | 0 | 0 | 0 |
minirec20230622_.nwb | 0 | 0 | 1 |
minirec20230622_.nwb | 0 | 0 | 2 |
minirec20230622_.nwb | 0 | 0 | 3 |
minirec20230622_.nwb | 0 | 0 | 4 |
minirec20230622_.nwb | 0 | 0 | 5 |
minirec20230622_.nwb | 0 | 0 | 6 |
minirec20230622_.nwb | 0 | 0 | 7 |
minirec20230622_.nwb | 0 | 0 | 8 |
minirec20230622_.nwb | 0 | 0 | 9 |
minirec20230622_.nwb | 0 | 0 | 10 |
minirec20230622_.nwb | 0 | 0 | 11 |
...
Total: 128
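To see these counts for yourself, you can fetch the table entries and count unique values. A quick sketch using the same query as above:
sg_electrodes = (
    sgs.SortGroup.SortGroupElectrode & {"nwb_file_name": nwb_file_name}
).fetch(as_dict=True)
print(len({e["electrode_id"] for e in sg_electrodes}), "unique electrode_id")
print(len({e["sort_group_id"] for e in sg_electrodes}), "unique sort_group_id")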
IntervalList¶
Next, we choose the time interval for our spike sorting using IntervalList.
sgc.IntervalList & {"nwb_file_name": nwb_file_name}
nwb_file_name name of the NWB file | interval_list_name descriptive name of this interval list | valid_times numpy array with start and end times for each interval |
---|---|---|
minirec20230622_.nwb | 01_s1 | =BLOB= |
minirec20230622_.nwb | 02_s2 | =BLOB= |
minirec20230622_.nwb | pos 0 valid times | =BLOB= |
minirec20230622_.nwb | pos 1 valid times | =BLOB= |
minirec20230622_.nwb | pos 2 valid times | =BLOB= |
minirec20230622_.nwb | pos 3 valid times | =BLOB= |
minirec20230622_.nwb | raw data valid times | =BLOB= |
Total: 7
Let's start with the first run interval (01_s1) and fetch the corresponding valid_times. For the minirec example, this is relatively short.
interval_list_name = "01_s1"
interval_list = (
sgc.IntervalList
& {"nwb_file_name": nwb_file_name, "interval_list_name": interval_list_name}
).fetch1("valid_times")[0]
def print_interval_duration(interval_list: np.ndarray):
duration = np.round((interval_list[1] - interval_list[0]))
print(f"This interval list is {duration:g} seconds long")
print_interval_duration(interval_list)
This interval list is 10 seconds long
SortInterval¶
For longer recordings, Spyglass subsets this interval with SortInterval. Below, we select the first n seconds of this interval.
n = 9
sort_interval_name = interval_list_name + f"_first{n}"
sort_interval = np.array([interval_list[0], interval_list[0] + n])
With the above, we can insert into SortInterval:
sgs.SortInterval.insert1(
{
"nwb_file_name": nwb_file_name,
"sort_interval_name": sort_interval_name,
"sort_interval": sort_interval,
},
skip_duplicates=True,
)
And verify the entry
print_interval_duration(
(
sgs.SortInterval
& {
"nwb_file_name": nwb_file_name,
"sort_interval_name": sort_interval_name,
}
).fetch1("sort_interval")
)
This interval list is 9 seconds long
Preprocessing Parameters¶
SpikeSortingPreprocessingParameters contains the parameters used to filter the recorded data in the spike band prior to sorting.
sgs.SpikeSortingPreprocessingParameters()
preproc_params_name | preproc_params |
---|---|
default | =BLOB= |
Total: 1
Here, we insert the default parameters and then fetch them.
sgs.SpikeSortingPreprocessingParameters().insert_default()
preproc_params = (
sgs.SpikeSortingPreprocessingParameters()
& {"preproc_params_name": "default"}
).fetch1("preproc_params")
print(preproc_params)
{'frequency_min': 300, 'frequency_max': 6000, 'margin_ms': 5, 'seed': 0}
Let's adjust frequency_min to 600, the preference for hippocampal data, and insert that into the table as a new parameter set.
preproc_params["frequency_min"] = 600
sgs.SpikeSortingPreprocessingParameters().insert1(
{
"preproc_params_name": "default_hippocampus",
"preproc_params": preproc_params,
},
skip_duplicates=True,
)
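We can fetch the new entry back to confirm the change took effect, a quick round-trip check using the same syntax as above:
hippo_params = (
    sgs.SpikeSortingPreprocessingParameters()
    & {"preproc_params_name": "default_hippocampus"}
).fetch1("preproc_params")
assert hippo_params["frequency_min"] == 600, "unexpected frequency_min"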
Processing a key¶
A key describes an entry we want to move through the pipeline, and keys are often managed as dictionaries. Here, we'll manage the spike sorting recording key, ssr_key.
interval_list_name
'01_s1'
ssr_key = dict(
nwb_file_name=nwb_file_name,
sort_group_id=0, # See SortGroup
sort_interval_name=sort_interval_name, # First N seconds above
preproc_params_name="default_hippocampus", # See preproc_params
interval_list_name=interval_list_name,
team_name="My Team",
)
Recording Selection¶
We now insert this key into the SpikeSortingRecordingSelection table to specify which time/tetrode/etc. of the recording we want to extract.
sgs.SpikeSortingRecordingSelection.insert1(ssr_key, skip_duplicates=True)
sgs.SpikeSortingRecordingSelection() & ssr_key
nwb_file_name name of the NWB file | sort_group_id identifier for a group of electrodes | sort_interval_name name for this interval | preproc_params_name | team_name | interval_list_name descriptive name of this interval list |
---|---|---|---|---|---|
minirec20230622_.nwb | 0 | 01_s1_first9 | default_hippocampus | My Team | 01_s1 |
Total: 1
SpikeSortingRecording¶
And now we're ready to extract the recording! The populate command will automatically process data in Computed or Imported table tiers.
If we only want to process certain entries, we can grab their primary key with the .proj() command and use a list of primary keys when calling populate.
ssr_pk = (sgs.SpikeSortingRecordingSelection & ssr_key).proj()
sgs.SpikeSortingRecording.populate([ssr_pk])
write_binary_recording with n_jobs = 8 and chunk_size = 299593
write_binary_recording: 0%| | 0/1 [00:00<?, ?it/s]
Now we can see our recording in the table. Exciting!
sgs.SpikeSortingRecording() & ssr_key
nwb_file_name name of the NWB file | sort_group_id identifier for a group of electrodes | sort_interval_name name for this interval | preproc_params_name | team_name | recording_path | sort_interval_list_name descriptive name of this interval list |
---|---|---|---|---|---|---|
minirec20230622_.nwb | 0 | 01_s1_first9 | default_hippocampus | My Team | /home/cb/wrk/zOther/data/recording/minirec20230622_.nwb_01_s1_first9_0_default_hippocampus | minirec20230622_.nwb_01_s1_first9_0_default_hippocampus |
Total: 1
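The recording_path above points to the extracted data on disk. If you want to inspect it directly, it can be loaded back with spikeinterface. A hedged sketch, assuming the folder was saved in spikeinterface's format:
import spikeinterface as si

recording_path = (sgs.SpikeSortingRecording() & ssr_key).fetch1("recording_path")
recording = si.load_extractor(recording_path)
print(recording)  # channel count, sampling frequency, duration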
Artifact Detection¶
ArtifactDetectionParameters establishes the parameters for removing artifacts from the data. We may want to target artifact signals that fall within the frequency band of our filter (600 Hz to 6 kHz) and thus will not be removed by filtering.
For this demo, we'll use a parameter set to skip this step.
sgs.ArtifactDetectionParameters().insert_default()
artifact_key = (sgs.SpikeSortingRecording() & ssr_key).fetch1("KEY")
artifact_key["artifact_params_name"] = "none"
We then pair the artifact detection parameters in ArtifactDetectionParameters with a recording extracted through population of SpikeSortingRecording and insert into ArtifactDetectionSelection.
sgs.ArtifactDetectionSelection().insert1(artifact_key)
sgs.ArtifactDetectionSelection() & artifact_key
nwb_file_name name of the NWB file | sort_group_id identifier for a group of electrodes | sort_interval_name name for this interval | preproc_params_name | team_name | artifact_params_name | custom_artifact_detection |
---|---|---|---|---|---|---|
minirec20230622_.nwb | 0 | 01_s1_first9 | default_hippocampus | My Team | none | 0 |
Total: 1
Then, we can populate ArtifactDetection, which will find periods where there are artifacts, as specified by the parameters.
sgs.ArtifactDetection.populate(artifact_key)
Amplitude and zscore thresholds are both None, skipping artifact detection
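If you did want to detect artifacts, you could insert a custom parameter set instead of "none". The sketch below is hypothetical: the key names mirror the thresholds mentioned in the message above, but verify them against the defaults in sgs.ArtifactDetectionParameters() before use.
# Hypothetical custom artifact parameters; confirm key names against the table's defaults
artifact_params = {
    "zscore_thresh": None,  # z-score threshold (None to skip)
    "amplitude_thresh": 3000,  # amplitude threshold, in uV
    "proportion_above_thresh": 1.0,  # fraction of channels that must exceed threshold
    "removal_window_ms": 1.0,  # window removed around each artifact
}
sgs.ArtifactDetectionParameters().insert1(
    {"artifact_params_name": "amplitude_3000", "artifact_params": artifact_params},
    skip_duplicates=True,
)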
Populating ArtifactDetection also inserts an entry into ArtifactRemovedIntervalList, which stores the interval without detected artifacts.
sgs.ArtifactRemovedIntervalList() & artifact_key
artifact_removed_interval_list_name | nwb_file_name name of the NWB file | sort_group_id identifier for a group of electrodes | sort_interval_name name for this interval | preproc_params_name | team_name | artifact_params_name | artifact_removed_valid_times | artifact_times np array of artifact intervals |
---|---|---|---|---|---|---|---|---|
minirec20230622_.nwb_01_s1_first9_0_default_hippocampus_none_artifact_removed_valid_times | minirec20230622_.nwb | 0 | 01_s1_first9 | default_hippocampus | My Team | none | =BLOB= | =BLOB= |
Total: 1
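We can reuse print_interval_duration from earlier to check the artifact-free interval, assuming artifact_removed_valid_times stores [start, stop] pairs like IntervalList's valid_times:
valid_times = (sgs.ArtifactRemovedIntervalList() & artifact_key).fetch1(
    "artifact_removed_valid_times"
)
print_interval_duration(valid_times[0])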
Spike sorting¶
SpikeSorterParameters¶
For our example, we will be using mountainsort4. There are already some default parameters in the SpikeSorterParameters table we'll fetch.
sgs.SpikeSorterParameters().insert_default()
# Let's look at the default params
sorter_name = "mountainsort4"
ms4_default_params = (
sgs.SpikeSorterParameters
& {"sorter": sorter_name, "sorter_params_name": "default"}
).fetch1()
print(ms4_default_params)
{'sorter': 'mountainsort4', 'sorter_params_name': 'default', 'sorter_params': {'detect_sign': -1, 'adjacency_radius': -1, 'freq_min': 300, 'freq_max': 6000, 'filter': True, 'whiten': True, 'num_workers': 1, 'clip_size': 50, 'detect_threshold': 3, 'detect_interval': 10, 'tempdir': None}}
Now we can change these default parameters to line up more closely with our preferences.
sorter_params = {
    **ms4_default_params["sorter_params"],  # start with defaults
    "detect_sign": -1,  # downward-going spikes (1 for upward, 0 for both)
    "adjacency_radius": 100,  # sort electrodes together within 100 microns
    "filter": False,  # no filter, since we filter prior to sorting
    "freq_min": 0,
    "freq_max": 0,
    "whiten": False,  # no whitening, since we whiten prior to sorting
    "num_workers": 4,  # number of parallel workers
    "verbose": True,
    "clip_size": np.int64(
        1.33e-3  # number of samples in 1.33 ms, based on the sampling rate
        * (sgc.Raw & {"nwb_file_name": nwb_file_name}).fetch1("sampling_rate")
    ),
}
from pprint import pprint
pprint(sorter_params)
{'adjacency_radius': 100, 'clip_size': 39, 'detect_interval': 10, 'detect_sign': -1, 'detect_threshold': 3, 'filter': False, 'freq_max': 0, 'freq_min': 0, 'num_workers': 4, 'tempdir': None, 'verbose': True, 'whiten': False}
We can give these sorter_params a sorter_params_name and insert into SpikeSorterParameters.
sorter_params_name = "hippocampus_tutorial"
sgs.SpikeSorterParameters.insert1(
{
"sorter": sorter_name,
"sorter_params_name": sorter_params_name,
"sorter_params": sorter_params,
},
skip_duplicates=True,
)
(
sgs.SpikeSorterParameters
& {"sorter": sorter_name, "sorter_params_name": sorter_params_name}
).fetch1()
{'sorter': 'mountainsort4', 'sorter_params_name': 'hippocampus_tutorial', 'sorter_params': {'detect_sign': -1, 'adjacency_radius': 100, 'freq_min': 0, 'freq_max': 0, 'filter': False, 'whiten': False, 'num_workers': 4, 'clip_size': 39, 'detect_threshold': 3, 'detect_interval': 10, 'tempdir': None, 'verbose': True}}
SpikeSortingSelection¶
Gearing up to Spike Sort!
We now collect our various keys to insert into SpikeSortingSelection, which is specific to this recording and eventual sorting segment.
Note: the spike sorter parameters defined above are specific to mountainsort4 and may not work for other sorters.
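To see which sorters already have parameter sets in the table, a quick sketch using the fetch syntax from earlier:
print(set(sgs.SpikeSorterParameters().fetch("sorter")))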
ss_key = dict(
**(sgs.ArtifactDetection & ssr_key).fetch1("KEY"),
**(sgs.ArtifactRemovedIntervalList() & ssr_key).fetch1("KEY"),
sorter=sorter_name,
sorter_params_name=sorter_params_name,
)
ss_key.pop("artifact_params_name")
ss_key
{'nwb_file_name': 'minirec20230622_.nwb', 'sort_group_id': 0, 'sort_interval_name': '01_s1_first9', 'preproc_params_name': 'default_hippocampus', 'team_name': 'My Team', 'artifact_removed_interval_list_name': 'minirec20230622_.nwb_01_s1_first9_0_default_hippocampus_none_artifact_removed_valid_times', 'sorter': 'mountainsort4', 'sorter_params_name': 'hippocampus_tutorial'}
sgs.SpikeSortingSelection.insert1(ss_key, skip_duplicates=True)
(sgs.SpikeSortingSelection & ss_key)
nwb_file_name name of the NWB file | sort_group_id identifier for a group of electrodes | sort_interval_name name for this interval | preproc_params_name | team_name | sorter | sorter_params_name | artifact_removed_interval_list_name | import_path optional path to previous curated sorting output |
---|---|---|---|---|---|---|---|---|
minirec20230622_.nwb | 0 | 01_s1_first9 | default_hippocampus | My Team | mountainsort4 | hippocampus_tutorial | minirec20230622_.nwb_01_s1_first9_0_default_hippocampus_none_artifact_removed_valid_times |
Total: 1
SpikeSorting¶
After adding to SpikeSortingSelection, we can simply populate SpikeSorting.
Note: This may take time with longer data sets. Be sure to pip install mountainsort4 if this is your first time spike sorting.
# To process only our entry, pass its key: sgs.SpikeSorting.populate([(sgs.SpikeSortingSelection & ss_key).proj()])
sgs.SpikeSorting.populate()
Running spike sorting on {'nwb_file_name': 'minirec20230622_.nwb', 'sort_group_id': 0, 'sort_interval_name': '01_s1_first9', 'preproc_params_name': 'default_hippocampus', 'team_name': 'My Team', 'sorter': 'mountainsort4', 'sorter_params_name': 'hippocampus_tutorial', 'artifact_removed_interval_list_name': 'minirec20230622_.nwb_01_s1_first9_0_default_hippocampus_none_artifact_removed_valid_times'}...
Mountainsort4 use the OLD spikeextractors mapped with NewToOldRecording
Using temporary directory /home/cb/wrk/zOther/data/tmp/tmpr9_xzjwk
Using 4 workers.
Preparing neighborhood sorters (M=31, N=269997)...
[... verbose per-channel detection/clustering output trimmed ...]
Done with ms4alg.
mountainsort4 run time 5.69s
Saving sorting results...
/home/cb/miniconda3/envs/spy/lib/python3.9/site-packages/spikeinterface/core/basesorting.py:212: UserWarning: The registered recording will not be persistent on disk, but only available in memory warnings.warn("The registered recording will not be persistent on disk, but only available in memory") /home/cb/miniconda3/envs/spy/lib/python3.9/tempfile.py:821: ResourceWarning: Implicitly cleaning up <TemporaryDirectory '/home/cb/wrk/zOther/data/tmp/tmpr9_xzjwk'> _warnings.warn(warn_message, ResourceWarning)
Check to make sure the table populated¶
sgs.SpikeSorting() & ss_key
nwb_file_name name of the NWB file | sort_group_id identifier for a group of electrodes | sort_interval_name name for this interval | preproc_params_name | team_name | sorter | sorter_params_name | artifact_removed_interval_list_name | sorting_path | time_of_sort in Unix time, to the nearest second |
---|---|---|---|---|---|---|---|---|---|
minirec20230622_.nwb | 0 | 01_s1_first9 | default_hippocampus | My Team | mountainsort4 | hippocampus_tutorial | minirec20230622_.nwb_01_s1_first9_0_default_hippocampus_none_artifact_removed_valid_times | /home/cb/wrk/zOther/data/"sorting"/minirec20230622_.nwb_01_s1_first9_0_default_hippocampus_3335c236_spikesorting | 1689971050 |
Total: 1
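The sorting itself is saved to sorting_path, which we can fetch if we want to find the output on disk:
print((sgs.SpikeSorting & ss_key).fetch1("sorting_path"))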
Next Steps¶
Congratulations, you've spike sorted! See our next notebook for curation steps.