Code of Conduct
As with all Brainhacks, BrainHack Marseille is dedicated to providing a harassment-free Brainhack experience for everyone,
regardless of gender, gender identity and expression, sexual orientation, disability, physical appearance, body size, race, age, or religion.
We do not tolerate harassment of event participants in any form.
Sexual language and imagery are not appropriate for any event venue, including talks.
Event participants violating these rules may be sanctioned or expelled from the event.
Register
Sorry :-( Registration is now CLOSED.
Note that there is a limit of 80 people on site and that the event will be light-hybrid (unconference + some projects).
For late registration, you should still fill in the form (press the grey 'REGISTER' button) and send an email to the organization team.
Lunch and the social event are not guaranteed for late registrations.
To register for the event, please click on the button below to fill in the form.
Program
Monday 4th December
09h00-10h15 | Welcome to BrainHack Marseille 2023 (in-person/online)
10h15-13h00 | Training session
13h00-14h00 | Lunch break
14h00-15h20 | Sharing expertise session
15h20-15h50 | Coffee break
15h50-17h10 | Sharing expertise on open-source projects
17h10-18h00 | Short project presentations between Marseille and Lucca
18h00 | BHM social event! / Speed project dating
Tuesday 5th December
09h00-13h00 | Project work
13h00-14h00 | Lunch break
14h00-18h00 | Project work
18h00-19h00 | Round table on usage rights in scientific research
Wednesday 6th December
09h00-13h00 | Project work
13h00-14h00 | Lunch break
14h00-18h00 | Project work
Projects
Here you can find all the information about the event projects.
If you want to submit a project, follow the link, fill in the form, and open a GitHub issue. Projects can be anything you'd like to work on during the event with other people (coding, discussing a procedure with coworkers, brainstorming about a new idea), as long as you're ready to do a minimum of organization!
MYOnset: a Python package to detect EMG onset for electrophysiological studies
by Laure Spieser & Boris Burle
Laboratoire de Neurosciences Cognitives, Aix-Marseille University, CNRS
Among the brain’s functions, selecting and executing actions is certainly one of the most important. In this research domain, recording the electromyographic (EMG) activity of the muscles involved in action execution is an easy way to collect additional information on the processes of interest. Yet once EMG is recorded, it must be processed and analysed alongside the other collected data (e.g., behaviour, electrophysiological recordings). In particular, detecting the onsets of EMG bursts is often a critical processing step; however, few tools are available for it, and none was really suited to the typical designs of experimental psychology, such as reaction time tasks. To meet this need, we developed MYOnset, a Python package designed to help with such EMG processing, with particular attention to the detection of EMG burst onsets and offsets.
MYOnset integrates tools for standard preprocessing of EMG recordings, such as bipolar derivation and filtering. For onset detection, MYOnset proposes a two-step method: first, automatic detection of EMG burst onsets and offsets; second, visualization and manual correction of the detected onsets and offsets. MYOnset includes two algorithms combining different automatic detection methods. It also provides a dedicated window for the visualization and manual correction step, which is the most time-consuming step and for which no tool was previously available. This window offers a view adapted to EMG signals and the associated markers, i.e., experimental triggers and automatically detected EMG onsets and offsets. Importantly, users can interact with the onset and offset markers to adjust their positions, insert new onsets/offsets, and remove existing ones.
The MYOnset package is available on PyPI and GitHub.
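As a rough intuition for the automatic first step of this two-step method, here is a minimal threshold-crossing sketch in plain Python. The function names are hypothetical and this is not the MYOnset API; MYOnset's own algorithms combine several detection methods and add a manual-correction pass.

```python
# Conceptual sketch of automatic EMG burst onset/offset detection by
# threshold crossing on a smoothed, rectified signal. Function names are
# hypothetical -- this is not the MYOnset API.

def moving_average(x, width):
    """Simple moving-average envelope of a rectified signal."""
    half = width // 2
    out = []
    for i in range(len(x)):
        window = x[max(0, i - half):i + half + 1]
        out.append(sum(window) / len(window))
    return out

def detect_bursts(signal, threshold, smooth_width=5):
    """Return (onset, offset) sample indices where the envelope crosses
    the threshold. In MYOnset's workflow, such automatic estimates would
    then be reviewed and corrected manually."""
    envelope = moving_average([abs(s) for s in signal], smooth_width)
    bursts, onset = [], None
    for i, v in enumerate(envelope):
        if onset is None and v > threshold:
            onset = i                      # rising crossing: burst onset
        elif onset is not None and v <= threshold:
            bursts.append((onset, i))      # falling crossing: burst offset
            onset = None
    if onset is not None:                  # burst still active at end of trace
        bursts.append((onset, len(envelope)))
    return bursts

# Synthetic trace: baseline noise around zero with one burst in the middle
trace = [0.01, -0.02, 0.01, 0.9, -1.1, 1.0, -0.8, 0.02, -0.01, 0.01]
print(detect_bursts(trace, threshold=0.3))  # → [(2, 8)]
```

Note that the smoothing shifts the detected onset slightly before the true burst start, which is exactly why a visualization and manual-correction step matters in practice.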
Link to project repository/sources:
Goals for the BrainHack:
- discuss package organisation
- implement new detection methods (e.g., Bayesian changepoint detection)
- add code testing
Required skills
This is a data visualisation and physiology project; non-coding skills are required.
Surf(ac)ing fMRI data
by Matthieu Gilson, Julien Sein, Jean-Luc Anton, Andrea Bagante, Martin Szinte
Mattermost IDs: @matgilson, @julien.sein, @jl-anton, @andreabag
The goal of this project is to combine tools into a pipeline for surface-based analysis of fMRI data. Surface-based analysis is a powerful way to align data from different subjects and datasets (Nature, Science).
Join us to test tools that will help you to analyze your own fMRI data at the whole-brain level!
The pipeline will combine open-science tools such as fMRIPrep, Workbench (from the HCP), and nilearn (a Python library). We will provide data from a couple of subjects to benchmark the tools; they will be formatted in BIDS, a standard for sharing data.
Experience in Python is recommended.
You should install a Python distribution like Anaconda beforehand (https://anaconda.org/); we may also use an MRI viewer such as Mango (https://mangoviewer.com/) and tools from Workbench (https://humanconnectome.org/software/connectome-workbench).
Link to project repository/sources: To be announced
Goals for the BrainHack:
- contribute to benchmarking open-source tools for fMRI analysis
- contribute to promoting sharable open-source tools in the local neuroscientific community, beyond the computational community
What will participants learn?
- MRI data manipulation (including the BIDS format)
- fMRI preprocessing (fMRIPrep, Workbench)
- decoding (nilearn)
Good first issues:
- issue one: tutorial of nilearn on surface analysis
- issue two: find a good issue...
Required skills
This is a coding project; basic git skills are required.
Building models that interpret neuroimaging data
by Marmaduke Woodman
We are writing a new implementation of whole-brain models oriented towards recent machine learning algorithms. This implementation is for the students and postdocs who will come up with tomorrow's theory of brain (dys)function and need better tools for doing so.
Our project is 🦄🦄🦄 because we are building fine-grained models of neural dynamics in entire cohorts, where current whole-brain models only map coarse-grained statistics.
The package is being developed at https://github.com/ins-amu/vbjax and includes neural mass and field models, forward models for MEG/fMRI and some data fitting examples.
For the brainhack we will use HCP MEG data with Brainstorm-based preprocessing (scripts at https://github.com/maedoc/friedchicken), but this is not the main focus of the project.
More resources on the background of the modeling are available at https://thevirtualbrain.org and https://www.ebrains.eu/tools/the-virtual-brain.
Link to project repository/sources: https://github.com/ins-amu/vbjax
Goals for the BrainHack:
- discuss use cases with potential users, even those unfamiliar with modeling
- help new users install and run examples
- extend existing set of models
- write new examples for data users have already prepared
- test deep neural network for more flexible time series modeling
- Generate Sphinx docs ins-amu/vbjax#11
- Add tests for partial observed node dynamics ins-amu/vbjax#29
- Add colored noise option ins-amu/vbjax#7
Skills:
- Brainstorming use cases, data features & models 50%
- Python coding & debugging 50%
https://github.com/ins-amu/vbjax#readme
What will participants learn?
- how to run a whole-brain simulation
- how to use JAX & NumPyro, potentially with GPUs
- how to optimize a model to fit data
- how to do Bayesian MCMC to find parameters consistent with data
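To give a flavour of what a whole-brain simulation involves, here is a minimal sketch of coupled one-variable "regions" integrated with Euler's method, in plain Python. This is illustrative only: vbjax's actual neural mass/field models, JAX-based API, and forward models are far richer, and all names below are hypothetical.

```python
# Minimal sketch of a "whole-brain" network simulation: each region is a
# one-variable rate node coupled through a connectivity (connectome) matrix.
# Illustrative only -- not the vbjax API; names are hypothetical.
import math

def simulate(weights, steps=1000, dt=0.01, tau=1.0, coupling=0.1):
    """Euler-integrate dx_i/dt = (-x_i + tanh(1 + c * sum_j W_ij x_j)) / tau."""
    n = len(weights)
    x = [0.0] * n
    for _ in range(steps):
        # network input to each node: baseline drive plus weighted coupling
        drive = [1.0 + coupling * sum(weights[i][j] * x[j] for j in range(n))
                 for i in range(n)]
        # one Euler step for every node
        x = [x[i] + dt * (-x[i] + math.tanh(drive[i])) / tau for i in range(n)]
    return x

# Toy symmetric 3-region "connectome"
W = [[0.0, 1.0, 0.5],
     [1.0, 0.0, 0.5],
     [0.5, 0.5, 0.0]]
activity = simulate(W)  # steady-state activity per region
print(activity)
```

Swapping the hand-written Euler loop for JAX primitives (e.g. `jax.lax.scan`) is what makes such models differentiable and fast on GPUs, which is the point of the vbjax approach.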
We are mostly using data from the HCP (https://db.humanconnectome.org):
- preprocessed connectomes and BOLD from EBRAINS (https://search.kg.ebrains.eu)
- MEG with Brainstorm-based preprocessing (https://github.com/maedoc/friedchicken)
Number of collaborators: 2
Credit to collaborators: https://github.com/ins-amu/vbjax/graphs/contributors
Development status: 2_releases_existing
Topic: bayesian_approaches, causality, connectome, deep_learning, machine_learning, neural_networks, reproducible_scientific_methods, statistical_modelling, systems_neuroscience
Tools: Brainstorm, Jupyter, MNE, other
Programming language: containerization, documentation, Matlab, Python
Modalities: MEG, other
Required skills
This is a brainstorming and coding project; Python skills are required.
NARPS Open Pipelines - A codebase to study variability of fMRI analysis workflows
by Boris Clénet - R&D Engineer, Empenn Team, INRIA Rennes
Mattermost : @bclenet
The goal of the NARPS Open Pipelines project is to create a codebase reproducing the 70 pipelines of the NARPS study (Botvinik-Nezer et al., 2020) and share this as an open resource for the community.
We hope this tool will help in analysing and understanding the variability of fMRI analysis workflows, thus contributing to the reproducible research movement.
Relevant information about how to get started is in the README.md file.
Join us and contribute to an open-source tool for the community!
Link to project repository/sources: https://github.com/Inria-Empenn/narps_open_pipelines
Goals for the BrainHack:
- start new pipeline reproductions
- contribute to already started pipelines
- proofread the documentation
- contribute to the documentation
- write tests for existing pipelines
- Contribute to the documentation (give feedback, help organizing)
- Write the pseudo-code for a pipeline
- Debug already implemented pipelines
General information can be found here: README file
How to contribute: CONTRIBUTING file
- using the Nipype Python library
- lots of fMRI analysis workflow examples
- good practices for (Python) coding
Although it may not be useful during the brainhack, the project's documentation (see corresponding section) gives information about required data.
Number of collaborators: 4
Credit to collaborators: All project contributors are listed in the Credits section of the project.
Type: documentation, pipeline_development
Development status: 1_basic_structure
Topic: reproducible_scientific_methods, statistical_modelling
Tools: AFNI, ANTs, BIDS, Datalad, fMRIPrep, FSL, Nipype, SPM
Programming language: documentation, Python
Modalities: fMRI
Required skills
This is a brainstorming and coding project; Python skills are required.
PTVR – a visual perception software in Python to make virtual reality experiments easier to build and more reproducible
by Eric Castet & Pierre Kornprobst
Carlos Aguilar
PTVR is a free and open-source library for creating visual perception experiments in virtual reality using high-level Python script programming. It aims at helping behavioral science researchers/engineers leverage virtual reality’s power for their research without the need to learn how virtual reality programming works. The philosophy of PTVR is thus very close to the approach of PsychoPy (https://www.psychopy.org/) that has been so important and influential since 2007 for Vision Science researchers displaying stimuli on 2D monitors.
PTVR experiments run in a VR headset, making it easy to perform experiments in schools, hospitals, etc.
Having your experiment written as a PTVR script is very powerful in terms of scientific reproducibility. Notably, it allows anyone to reproduce your experiment in full, provided they have installed PTVR and possess a compatible VR headset.
Two types of sessions will be organized, in groups of 4 people:
a/ In the first type of session, PTVR features will be exemplified thanks to demo scripts. In this case, participants will simply discover PTVR by being subjects of simple experiments and by looking at the corresponding code. This should help participants to see whether PTVR might allow them to run some VR experiments they have in mind.
In this case, you do not need to prepare anything in advance.
b/ In the second type of session, participants will try to write simple PTVR scripts.
In this case, it would be helpful to:
- install a Python IDE like Spyder
- and install PTVR
You can find extensive information on the PTVR website: https://ptvr.inria.fr/
Link to project repository/sources: https://gitlab.inria.fr/PTVR_Public/PTVR_Researchers
Goals for the BrainHack: for sessions of type b/, deliverables will be new experiments, new demo scripts, or new pedagogic scripts written by participants. They will be shared in this GitHub repository:
https://github.com/ericcastet/Brainhack-Marseille-VR
Good first issues:
- issue one: read the documentation on the PTVR website, especially the 'User Manual' section.
- issue two: if you already have a VR headset, install PTVR and test its demo scripts.
In sessions of type a/, participants will experience VR experiments created with PTVR and will be able to assess whether the code is appropriate for them.
In sessions of type b/, participants will be beta users, helping us assess whether PTVR code is accessible to non-specialists in VR programming.
Although it may not be useful during the brainhack, the project's documentation (see corresponding section) gives information about required data.
Number of collaborators: 1
Credit to collaborators: for sessions of type b/ (see above), participants will be listed as beta users / testers on the PTVR website: About -> People.
Development status: 2_releases_existing
Topic: reproducible_scientific_methods
Programming language: Python
Modalities: fMRI
Required skills
This is a brainstorming and coding project; Python skills are required.
Across-scales Higher-Order neural interdependencies
Andrea Brovelli (https://twitter.com/BrovelliAndrea)
Etienne Combrisson (https://twitter.com/kNearNeighbors)
Collaborators
Thomas Robiglio (https://twitter.com/thomrobiglio)
Matteo di Volo (https://sites.google.com/view/matteodivolo/home)
How can we study the role of higher-order neural interdependencies within and across scales in the brain? Do perception and cognition arise from higher-order neural interdependencies? Higher-order interdependencies (HOIs) are defined as interactions involving more than two neurons, neural populations, or brain regions. Recently, the BraiNets team developed a new tool for higher-order analyses of neural time series (https://github.com/brainets/hoi). The metrics are based on recent advances in information theory and network science, combined with efficient optimisation software (JAX). Participants from all backgrounds are welcome. Novices in the field can familiarise themselves with the metrics and the Python toolbox. Advanced participants may take the opportunity to add novel metrics, functionalities, high-level scripts, and optimisation tools. All participants may bring their own dataset and/or explore the provided simulated data of spiking neural networks and networks of mean-field signals. The project has a GitHub repository (https://github.com/brainets/acrho), where we will share the simulated data and the outcomes of the BrainHack as Jupyter notebooks.
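As a toy illustration of what "beyond pairwise" means, the sketch below computes the interaction information (co-information) of three binary variables from samples, using the classic XOR example, where each pair of variables looks independent yet the triple is fully determined. This is conceptual plain Python, not the HOI toolbox API, which provides richer, optimised JAX-based estimators.

```python
# Toy sketch of a higher-order dependence measure: the interaction
# information (co-information) of three variables, estimated from samples.
# Conceptual only -- the HOI toolbox implements optimised estimators.
import math
from collections import Counter
from itertools import product

def entropy(counts):
    """Shannon entropy (bits) of an empirical distribution given as counts."""
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values() if c)

def interaction_information(samples):
    """I(X;Y;Z) = H(X)+H(Y)+H(Z) - H(XY)-H(XZ)-H(YZ) + H(XYZ).
    Negative values indicate synergy: the triple carries information
    that no pair does; positive values indicate redundancy."""
    h = lambda idx: entropy(Counter(tuple(s[i] for i in idx) for s in samples))
    return (h((0,)) + h((1,)) + h((2,))
            - h((0, 1)) - h((0, 2)) - h((1, 2))
            + h((0, 1, 2)))

# XOR: pairwise mutual informations are zero, yet Z = X xor Y.
xor_samples = [(x, y, x ^ y) for x, y in product([0, 1], repeat=2)]
print(interaction_information(xor_samples))  # → -1.0 (pure synergy)
```

For copied variables (X = Y = Z) the same function returns +1.0, the redundant case; metrics of this family, at scale and with proper estimators, are what the HOI toolbox computes on neural time series.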
Link to project repository/sources: https://github.com/brainets/hoi
https://github.com/brainets/acrho
Share the theoretical and computational tools for higher-order analysis of neural data. Develop novel functionalities and scripts for optimised analysis of large datasets.
Good first issues:
- issue one: read the documentation of the HOI toolbox (https://brainets.github.io/hoi/)
- issue two: read important papers cited in the HOI documentation
- issue three: explore the dataset containing simulations in the https://github.com/brainets/acrho repository
- issue four: try writing notebooks to perform HOI analyses on the dataset
https://framateam.org/int-brainets/channels/brainhack_acrho_2023
What will participants learn?
Beginners: 1) learn the basic information-theoretical notions and metrics; 2) run analyses on simulated data and extract HOIs.
Advanced: 1) contribute novel metrics; 2) develop example Jupyter notebooks showcasing different types of brain data (resting-state fMRI, task-related MEG/iEEG).
The project contains two sets of data.
In the first dataset (dataset 1), we simulated the activity of a biologically realistic spiking neural network composed of excitatory and inhibitory neurons. We recorded spike times, which can be loaded through the Jupyter notebook Read_spike_train.ipynb. We simulated the responses to two stimuli of different amplitudes, for a homogeneous network and for a network including cell-to-cell diversity in inhibitory neurons (see di Volo & Destexhe, Sci Rep 2021). A README file helps with reading the data.
In the second dataset (dataset 2), we employed a 1D spatially extended model of connected mean-field models, with anatomical connectivity following data from primary visual cortex. We studied the responses to external stimulation of two different amplitudes. Here too, we collected data for mean fields of homogeneous neurons and for a model including cell-to-cell diversity (see di Volo & Destexhe, Sci Rep 2021). A Jupyter notebook helps with reading the data and plotting the spatio-temporal response of the model.
Number of collaborators: 3
Credit to collaborators: contributors will be listed in the GitHub repository https://github.com/brainets/acrho
Type: coding_methods, method_development, pipeline_development
Development status: 0_concept_no_content
Topic: information_theory, neural_encoding, neural_networks, systems_neuroscience, other
Programming language: Python
Modalities: behavioral, ECOG, EEG, fMRI, MEG
Git skills: 1_commit_push
Required skills
This is a brainstorming and coding project; Python skills are required.
Team
David Meunier
Research Engineer
Dipankar Bachar
Research Engineer
Julia Sprenger
Engineer
Melina Cordeau
PhD student
Simon Moré
Engineer
Matthieu Gilson
Junior Professor
Manuel Mercier
Research Associate
Arnaud Le Troter
Research Engineer
Laurie Mifsud
PhD student
Caroline Strube
Research Coordinator
Christelle Zielinski
Data Analysis Engineer
Hugo Dary
Research Engineer
Marie Bourzeix
PhD student
Social network