Brainhack Marseille

November 28 - November 30 2022

Register

To register for the event, please click the button below and fill in the form. The deadline for registration is Friday, November 18th.
Registration is now CLOSED.
Note that attendance on site is limited to 60 people and that the event will be light-hybrid (unconference + some projects).
For late registration, please still fill in the form (click the grey 'REGISTER' button) and send an email to the organizing team.
Lunch and the social event are not guaranteed for late registrations.

Program

Please find below the program of the Brainhack event (timezone Paris, UTC+1). If you want to add this schedule to your own Google calendar, click on the "Save the dates" button and then on the "+ Google agenda" button located at the bottom right of your screen.


28th November

09h00-10h15 Introduction to BrainHack Marseille 2022 (In-person/Online)
  • 09h00-09h15 - Introduction to the event
  • 09h15-10h15 - Projects pitches
10h15-12h30 Hacking and/or Training:
  • 11h00-11h45 - Open tools for open science By Simon Moré, Manuel Mercier & Arnaud Le Troter
  • 11h45-12h00 - Reproducible science By Guillaume Auzias
12h30-14h00 Break
14h00-17h30 Hacking and/or Training:
  • 14h30-15h00 - GitHub basics By David Meunier
  • 15h00-15h30 - GitHub advanced By David Meunier
  • 15h30-16h00 - Introduction to Matlab By Manuel Mercier
  • 16h00-16h30 - Introduction to Python* By Julia Sprenger
  • 16h30-17h00 - Python advanced* By Julia Sprenger
*Python required: to participate in this session, bring a laptop with a Python installation and wifi access. Follow the regular install instructions for your operating system here; we recommend installing Miniconda. You should be able to open an Anaconda prompt, and typing `print('hello world')` should display 'hello world'. Tell us ahead of the session if you need further assistance with setting up Miniconda. The session will give an overview of the basic concepts in Python (finding your way in Python, basic data types, variables, advanced types, loops, etc.).
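As a small taste of what the session will cover, here is a short, self-contained snippet (variable names are purely illustrative) that you can paste into a Python prompt once Miniconda is installed:

```python
# Variables, basic and advanced data types, and a for loop - the kind of
# concepts the introductory session walks through.
message = 'hello world'                  # a string variable
counts = [1, 2, 3]                       # a list
subject = {'id': 'sub-01', 'age': 27}    # a dictionary

for value in counts:                     # looping over a list
    print(message, value)

print(subject['id'], 'is', subject['age'], 'years old')
```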
17h30-> BHM social event!

29th November

09h00-09h30 Breakfast
09h30-12h30 Hacking and/or Training:
  • 10h00-11h00 - Tools for data science & data analysis
    • MNE by Ruggero Basanisi
    • FieldTrip by Manuel Mercier
    • FreeSurfer by Guillaume Auzias
    • Scikit-Learn by Matthieu Gilson
  • 11h00-12h00 - Open-source software (round table)
12h30-14h00 Break
14h00-15h00 Hacking
15h00-16h00 How to archive code with Software Heritage (In-person/Online) by Sabrina Granger
16h00-17h00 Hacking and/or Training:
  • 16h00-17h00 Training course about "FAIR4RS & SWH" by Julien Caugant
17h00-17h15 Break
17h15-18h30 What Neurosciences In The Anthropocene Era? A participatory workshop (In-person) By Daniele Schön & Julien Lefevre

30th November

09h00-09h30 Breakfast
09h30-12h30 Hacking
12h30-14h00 Break
14h00-16h30 Hacking
16h30-17h30 Wrapping up the projects - closing BHM 2022 (In-person/Online)

Projects

Here you can find all the information about the event projects.
If you want to submit a project, follow the link, fill in the form, and open a GitHub issue. Projects can be anything you'd like to work on during the event with other people (coding, discussing a procedure with co-workers, brainstorming about a new idea), as long as you're ready to minimally organize it!



Automatise your processing pipelines with nipype / pydra

by David Meunier


Neuroimaging and electrophysiology processing requires many steps, calling different software packages, possibly in different languages (typically Matlab batches or shell scripts).
Nipype provides an integrative solution, with a sufficient level of complexity to cover most needs for writing pipelines in neuroimaging. It is based on the notion of workflows: an ordered succession of nodes, linking inputs and outputs. Nodes can be user-written functions (in Python), interfaces with existing software (e.g. FSL, AFNI or SPM), or even other user-defined sub-workflows.
Nipype is at the base of many widely used Docker images, such as fmriprep and qsiprep, and has been extended to other applications, such as EEG/MEG processing (ephypype), graph analysis in functional connectivity (graphpype) or non-human primate anatomical MRI segmentation (macapype).
Nipype has now reached a degree of maturity that makes it predominant in the community, but some limitations remain. In recent years it was decided to rewrite the core engine of Nipype to incorporate new functionalities, such as running containers as a node. The new implementation is called Pydra; although still in its infancy, we expect it to become a major standard in the community.
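As an illustration of the workflow/node concepts described above, here is a minimal sketch (not taken from any existing pipeline) of a two-node Nipype workflow wrapping plain Python functions:

```python
# A minimal Nipype workflow: two user-written Python functions chained as nodes.
from nipype import Node, Workflow
from nipype.interfaces.utility import Function

def add_one(x):
    return x + 1

# Wrap the plain Python function as interfaces, then as nodes
node_a = Node(Function(input_names=['x'], output_names=['out'],
                       function=add_one), name='add_one_a')
node_b = Node(Function(input_names=['x'], output_names=['out'],
                       function=add_one), name='add_one_b')
node_a.inputs.x = 1

wf = Workflow(name='toy_pipeline', base_dir='/tmp')
wf.connect(node_a, 'out', node_b, 'x')   # output of node_a feeds the input of node_b
wf.run()                                 # executes the graph, caching results on disk
```

In a real pipeline, the Function nodes would be replaced by interfaces to FSL, AFNI, SPM, etc., with the same connect/run logic.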

In this project, we propose:
  • To give an overview of how nipype works
  • To advise you on whether it is useful for your typical processing
  • To help write specific nodes or workflows dedicated to your processing

For advanced users, we also propose:
  • To explain the advances of Pydra compared to Nipype
  • To port some existing Nipype tools to Pydra

Required skills

This is a multimodal project; a mix of coding and non-coding skills is required:
Neuroimaging/electrophysiology processing 100%
Shell Script / Matlab Batch 75%
Python 50%
Nipype 25%

16S Metagenomic analysis of Gut Microbiome

by Dipankar Bachar


16S rRNA gene sequencing is commonly used for the identification, classification and quantitation of microbes within complex biological mixtures such as environmental and gut samples. Many studies [1][2] are trying to establish the link between the intestinal microbiota and various neurodevelopmental or neurological disorders. In these studies, the bioinformatic analysis of data from gut samples plays a very important role. In our Brainhack project, we will work on the publicly available data from one of these articles [1]. This project is aimed at those who are interested in the bioinformatic analysis of the gut microbiota, and in the different pipelines and analysis tools such as QIIME2 [4], Cutadapt [5], Kraken2 [6], etc.

In this project, we propose:
  • To go through this article [1] to understand their method of analyzing the gut microbiota
  • To download their 16S sequencing data from NCBI (submitted by the authors)
  • To extract and filter the data

For advanced users, we also propose:
  • To write a few tools (in Python) for filtering the sequencing data (see the sketch after this list)
  • To try to create an Anaconda environment or a Singularity image with the 16S analysis pipeline QIIME2 [4]
  • To try to implement some of the methods of analysis with QIIME2 (as described in the article)
  • To understand the importance of reproducibility of the results
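As an illustration of the kind of filtering tool mentioned in the list above, here is a minimal sketch, using only the Python standard library, of a length/quality filter for FASTQ reads; file names and thresholds are hypothetical:

```python
# Keep only reads that are long enough and have a sufficient mean Phred quality.
import gzip

def mean_quality(qual_line, offset=33):
    """Mean Phred quality of a read (Sanger/Illumina 1.8+ encoding)."""
    return sum(ord(c) - offset for c in qual_line) / len(qual_line)

def filter_fastq(in_path, out_path, min_len=200, min_q=25):
    opener = gzip.open if in_path.endswith('.gz') else open
    kept = total = 0
    with opener(in_path, 'rt') as fin, open(out_path, 'w') as fout:
        while True:
            record = [fin.readline() for _ in range(4)]  # header, seq, '+', qual
            if not record[0]:
                break
            total += 1
            seq, qual = record[1].strip(), record[3].strip()
            if len(seq) >= min_len and mean_quality(qual) >= min_q:
                fout.writelines(record)
                kept += 1
    print(f'kept {kept}/{total} reads')

# filter_fastq('sample_R1.fastq.gz', 'sample_R1.filtered.fastq')  # hypothetical files
```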

References
[1] The microbial metabolite p-Cresol induces autistic-like behaviors in mice by remodeling the gut microbiota. Patricia Bermudez-Martin et al.
[2] Gut Microbiota Regulate Motor Deficits and Neuroinflammation in a Model of Parkinson's Disease. Timothy R Sampson et al.
[3] 16S rRNA Gene Sequencing for identification, classification and quantitation of microbes
[4] Reproducible, interactive, scalable and extensible microbiome data science using QIIME 2. Evan Bolyen et al.
[5] Cutadapt
[6] Kraken 2

Required skills

This is a multimodal project; a mix of coding and non-coding skills is required:
Anaconda or Singularity 50%
Python 50%
Unix / Linux 25%
Knowledge of 16S sequencing 25%

LFP and single units decoding in rats performing a distance estimation task

by Fabrizio Capitano, Pierre-Yves Jacob, Celia Laurent & Francesca Sargolini


Navigating in space requires continuously updating the representation of one's own position in the environment. In poorly cued environments (e.g. in darkness), this can only be done by estimating the direction and distance of movement in order to compute a vectorial integration of the trajectory (so-called path integration). In a set of studies in rats, we demonstrated that the entorhinal cortex plays a fundamental role in path integration. However, how the circuit infers running distance is not yet known. Using a distance estimation task, we discovered that theta oscillations of the local field potential and the firing patterns of single neurons in the entorhinal cortex correlate with the running distance estimated by the rats. But how much, and how well, do unitary and field activity allow the circuit to infer the travelled distance? In the context of BrainHack Marseille 2022, we would like to interact and collaborate with people with good knowledge of neural decoding of EEG/LFP and/or unitary activity, in order to implement proper methods to decode the estimated running distance from a dataset of electrophysiological recordings in behaving rats. The final goal is to quantify the information carried by population and single-unit activity within the entorhinal circuit.
Here and here you'll find some of our previous studies, for more insight into the neuroscientific framework of the project.

Goals for the BrainHack:
  • Discuss the pros and cons of different decoding methods
  • Sort out the most appropriate one(s) for our project
  • Start implementing the analysis on the dataset (see the sketch after this list)
  • More generally, seed future collaborations
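As a starting point for the implementation item above, here is a minimal sketch, assuming scikit-learn and a hypothetical trial-by-feature matrix (e.g. theta power and firing rates), of a cross-validated regression decoder of running distance:

```python
# Cross-validated decoding of a continuous variable (running distance) from
# electrophysiological features; the data below are random placeholders.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 30))                    # trials x features
distance = X[:, 0] * 2.0 + rng.standard_normal(200)   # hypothetical target variable

scores = cross_val_score(Ridge(alpha=1.0), X, distance, cv=5, scoring='r2')
print(f'decoding R^2: {scores.mean():.2f} +/- {scores.std():.2f}')
```

The same scheme carries over to classifiers or more elaborate decoders; the cross-validation step is what quantifies how much information the features carry about the travelled distance.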

Required skills

This is a multimodal project; a mix of coding and non-coding skills is required:
Neural decoding 100%
EEG/LFP and single units 80%
Spatial cognition 50%

SLAM - A pure python package for Surface anaLysis And Modeling

by Guillaume Auzias & Alexandre Pron


Slam is an open-source Python package dedicated to the representation of neuroanatomical surfaces stemming from MRI data in the form of triangular meshes, and to their processing and analysis. Main features include read/write of GIFTI (and NIfTI) file formats, geodesic distance computation, several implementations of graph Laplacian and gradient, mesh surgery (boundary identification, large-hole closing), and several types of mapping between the mesh and a sphere, a disc... Have a look at the examples on the documentation website.
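As a quick way to see what such a surface mesh contains, here is a minimal sketch of reading a GIFTI mesh with nibabel (this is not slam's own API, and the file name is hypothetical):

```python
# Load a GIFTI surface mesh and inspect its vertices and triangles.
import nibabel as nib

img = nib.load('lh.white.gii')                           # hypothetical mesh file
vertices, faces = img.agg_data(('pointset', 'triangle'))
print(vertices.shape, faces.shape)                       # (n_vertices, 3), (n_faces, 3)
```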

Goals for Brainhack:
  • To improve the documentation of basic core functions and modules (e.g. the curvature.py module)
  • To increase unit test coverage and quality of basic core functions and modules
  • To speed up specific pieces of code, such as the computation of curvatures
  • To help potential users get familiar with this Python package, depending on their use cases

Required skills

This is a multimodal project; a mix of coding and non-coding skills is required:
Share ideas and have a good time 100%
Code documentation 80%
Scientific Python 20%
GitHub 10%
Meshes 10%

Viewing the world in third-person - experimental considerations for a successful study

by Jérémy Faggion


  • I am conceiving an experiment to study the implications of living with a third-person point of view on mental states and self-awareness. For the self-awareness implications, see for instance the research of Olaf Blanke. I expect positive implications for users, including increased self-awareness and increased focus, and interesting implications for users undergoing temporary stress or anxiety. If the issue of learning to associate somatosensory information with the new way of seeing the world can be solved, this could open the door to many more experiments around metacognition, but also around learning involving perception and movement of the body. The project is inspired by Hubert Dolezal's 1982 study using prism glasses to experience the world upside-down.
  • Visuo-tactile association in a third-person point of view is hard to achieve. The challenge revolves around how to make the brain learn that association. One candidate approach is to start from a first-person point of view and gradually transition to a third-person point of view, using a camera transmitting its video feed to a headset worn by the user. Solutions already exist for drone piloting (FPV goggles).
  • Identify the bottlenecks to come. Propose a design solution to allow for a gradual transition from first-person to third-person point of view. Experiment with FPV goggles and bring them to the Brainhack :) The outcome of this project should be in the form of schematics, prototypes, pseudocode, people to reach out to and a list of resources to take inspiration from.
Resources:
  • https://petapixel.com/2014/07/02/custom-built-oculus-rift-gopro-rig-lets-experience-life-third-person/
  • https://www.youtube.com/watch?v=anE3RNf_3s0


Goals for Brainhack:
  • The outcome of this project should be in the form of schematics, pseudocode, people to reach out to and a list of resources to take inspiration from.
  • Iteration cycles should be as short as possible.
  • Testing should ideally happen on-site during the hackathon.

Required skills

This is a multimodal project; a mix of coding and non-coding skills is required:
Sensorimotor association 40%
Engineering/Coding 30%
Fast iteration mindset 20%
VR 10%

Neural encoding of acoustic features during speech and music perception: translation of Matlab code into Python

by Benjamin Morillon, Bruno Giordano, Giorgio Marinato, Nadège Marin & Arnaud Zalta


The main goal of this project is to translate Matlab code into Python. The code performs cross-validated Representational Similarity Analyses (RSA) to estimate the similarity between the acoustic features of an auditory stream (speech, music) and neural activity (here, intracranial EEG recordings decomposed into frequency bands). The code will be made available on a GitHub page (github.com/DCP-INS) and will be widely used by the DCP team at INS.
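As a sketch of what the Python side could look like (this is not the team's Matlab code), here is a minimal RSA step with NumPy/SciPy on placeholder data: build representational dissimilarity matrices (RDMs) for acoustic features and neural activity, then compare them with a rank correlation:

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n_stim = 40
acoustic = rng.standard_normal((n_stim, 10))   # placeholder acoustic features
neural = rng.standard_normal((n_stim, 64))     # placeholder band-limited iEEG responses

# Condensed RDMs: pairwise dissimilarities between stimuli
rdm_acoustic = pdist(acoustic, metric='correlation')
rdm_neural = pdist(neural, metric='correlation')

# Representational similarity = rank correlation between the two RDMs
rho, p = spearmanr(rdm_acoustic, rdm_neural)
print(f'RSA similarity: rho={rho:.3f}, p={p:.3g}')
```

A cross-validated variant splits trials into folds before computing the RDMs, but the overall structure stays the same.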

Required skills

This is a multimodal project; a mix of coding and non-coding skills is required:
Sensorimotor association 40%
Engineering/Coding 30%
Fast iteration mindset 20%
VR 10%

Guidelines for tractography in primates

by Melina Cordeau, David Meunier, Julien Sein & Arnaud Le Troter


Create guidelines, i.e. summarize all the important things to know, step by step, to perform good tractogram editing for non-human primate species: a document that would bring together all the parameters that are similar and those that differ between species, for preprocessing and tractography. We will focus on marmosets, macaques, baboons and chimpanzees.

Goals for Brainhack:
  • Definition of guidelines based on human best practices
  • Creation of a parameter file for each primate species
  • Implementation of a corresponding Wiki on the PRIME-RE platform

Required skills

This is a multimodal project; a mix of coding and non-coding skills is required:
Be friendly, tolerant, and inclusive 100%
DWI data skills 75%
Python coding 25%

Inferring task-related higher-order interactions from brain network signals

by Etienne Combrisson, Andrea Brovelli, Daniele Marinazzo, Matteo Neri & Ruggero Basanisi


A central hypothesis in neuroscience posits that cognitive functions emerge from complex interactions between multiple brain regions. Similarly, cognitive deficits due to trauma or neurological conditions, such as stroke, crucially depend on network-level alterations that disrupt normal interactions among multiple brain areas. Although central, progress towards testing these hypotheses has been limited by the lack of approaches for studying interactions between multiple brain regions beyond pairwise relations, the so-called higher-order interactions (HOIs). The aim of our project is to build a novel approach, based on recent advances in information theory (the O-information metric), to infer task- or condition-specific HOIs (functional HOIs) from brain signals. We will explore the possibility of combining O-information estimates with the permutation-based statistics implemented in Frites.
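As an illustration of the metric at the core of the project, here is a minimal sketch of the O-information, Omega(X) = (n-2)H(X) + sum_i [H(X_i) - H(X_{-i})], computed under a Gaussian assumption (this is not the Frites implementation, and the toy data are placeholders):

```python
import numpy as np

def gaussian_entropy(data):
    """Entropy (nats) of a multivariate Gaussian fitted to data (n_samples, n_vars)."""
    k = data.shape[1]
    cov_det = np.var(data[:, 0], ddof=1) if k == 1 else np.linalg.det(np.cov(data, rowvar=False))
    return 0.5 * np.log((2 * np.pi * np.e) ** k * cov_det)

def o_information(data):
    """O-information of a (n_samples, n_vars) dataset under a Gaussian model."""
    n = data.shape[1]
    omega = (n - 2) * gaussian_entropy(data)
    for i in range(n):
        omega += gaussian_entropy(data[:, [i]])                 # H(X_i)
        omega -= gaussian_entropy(np.delete(data, i, axis=1))   # H(X_{-i})
    return omega  # > 0: redundancy-dominated, < 0: synergy-dominated

# Toy redundant system: three noisy copies of the same latent signal
rng = np.random.default_rng(0)
latent = rng.standard_normal((5000, 1))
x = latent + 0.5 * rng.standard_normal((5000, 3))
print(o_information(x))  # expected to be positive (redundancy)
```

The task-related version would compare such estimates between experimental conditions, with permutation-based statistics assessing significance.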

The main goal of this BrainHack is to have a first working version of the task-related HOI estimation:
  • Prototype the main function (i.e. define input and output types, write down the important internal steps)
  • Make it work in the non-dynamic case
  • Investigate the use of Jax to speed up computations
  • Be able to simulate data with a known amount of redundancy and synergy

Required skills

This is a multimodal project; a mix of coding and non-coding skills is required:
Computational 80%
Information-theory 60%
Math 50%
Python 70%

Automatic detection of spiking motifs in neurobiological data

by Matthieu Gilson, Laurent Perrinet, Hugo Ladret & George Abitbol


The study of spatio-temporal correlated activity patterns is very active in several fields related to neuroscience, like machine learning in vision (Muller Nat Rev Neurosci 2018) and neuronal representations and processing (Shahidi Nat Neurosci 2019). This project aims to develop a method for the automated detection of repeating spiking motifs, possibly noisy, in ongoing activity. A diversity of formalizations and detection methods have been proposed and we will focus on several example measures for event/spike trains, to be compared on both synthetic and real data.
An implementation could be based on auto-differentiable networks as implemented in Python libraries like PyTorch. This framework allows parameters to be tuned with specific architectures, such as convolutional layers, that can capture various timescales in spike patterns (e.g. latencies) in an automated fashion. Another recent tool, based on the estimation of firing probability for a range of latencies, has also been proposed (Grimaldi, ICIP 2022). These will be compared with existing approaches like Elephant's SPADE, or with decoding techniques based on statistics computed on smoothed spike trains (adapted from time series processing, see Lawrie, bioRxiv).
One part of the project concerns the generation of realistic synthetic data: spike trains that include spiking motifs with specific latencies or co-modulation of firing rates. The goal is to test how these different structures, which rely on specific assumptions (e.g. stationarity or independent firing probability across time), can be captured by the different detection methods.
Bring your real data to analyze! We will also provide data from electrophysiology.
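As an illustration of the synthetic-data part described above, here is a minimal sketch (all parameters hypothetical) embedding a repeating spiking motif, with fixed latencies across neurons, into Poisson background activity:

```python
import numpy as np

rng = np.random.default_rng(0)
n_neurons, duration, rate = 10, 100.0, 2.0               # neurons, seconds, Hz

# Background: independent Poisson spike trains
spikes = [np.sort(rng.uniform(0, duration, rng.poisson(rate * duration)))
          for _ in range(n_neurons)]

# Motif: each neuron fires once at a fixed latency after each motif onset
latencies = rng.uniform(0, 0.05, n_neurons)              # 0-50 ms latencies
onsets = np.sort(rng.uniform(0, duration - 0.1, 20))     # 20 motif occurrences

spikes = [np.sort(np.concatenate([train, onsets + latencies[i]]))
          for i, train in enumerate(spikes)]

# 'spikes' is a list of spike-time arrays; 'onsets' is the ground truth that a
# detection method should recover.
```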

Goals for Brainhack:
  • Code to generate various models of synthetic data (time series of spikes/events) with embedded patterns
  • Knowledge in signal processing & high-order statistics (correlation)
  • Tool for quantitative comparison of detection methods for correlated patterns

Required skills

This is a multimodal project; a mix of coding and non-coding skills is required:
Interest in analyzing spiking activity 100%
Python (numpy, scikit-learn, pytorch) 70%
Data (spike trains, event time series) 50%
Sharing concepts and ideas 40%

BIDS, electrophysiological data, open data

by Sylvain Takerkart & Julia Sprenger


We are currently developing a standard for organizing electrophysiological data recorded in animal models. It consists of extending the BIDS standard so that it supports this new data modality, using either the NIX or NWB data format. This is not a classical project; it is just a way to say that we're available to discuss all this during Brainhack Marseille (and also Brainhack Global)! So come and chat with us if you're an electrophysiologist interested in open science (or a Python developer with some spare time).

Goals for Brainhack:
  • Chatting, discussing, making progress!

Required skills

This is a multimodal project; a mix of coding and non-coding skills is required:
Electrophysiology 80%
Open science and data management 50%
Python (optional) 20%

PsychophyGit - a fast and flexible tool for online experiments

by Hugo Ladret, Jean-Nicolas Jérémie & Laurent Perrinet


Acquiring human behavioural data through the Internet is on the rise, especially since the COVID-19 pandemic. Most often, researchers looking for a way to set up online experiments will find themselves either a) using complex, tool-specific paradigms or b) developing their own web code to suit their specific needs. For this Brainhack project, we aim to develop a fast and flexible tool that would allow anyone to perform custom psychophysical investigations through the web. For convenience, users will generate their own stimuli locally and upload them to a GitHub repository, and we will develop a tool that allows reliable online procedures to be deployed using these stimuli.
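As an illustration of the stimulus-fetching idea, here is a minimal sketch that lists stimulus files in a public GitHub repository via the REST "contents" API; the repository, path and file extension are hypothetical:

```python
import requests

def list_stimuli(owner, repo, path=''):
    """Return direct download URLs of .png stimuli stored in a public GitHub repo."""
    url = f'https://api.github.com/repos/{owner}/{repo}/contents/{path}'
    resp = requests.get(url, timeout=10)
    resp.raise_for_status()
    # Each entry has 'name', 'type' and, for files, a direct 'download_url'
    return [item['download_url'] for item in resp.json()
            if item['type'] == 'file' and item['name'].endswith('.png')]

# urls = list_stimuli('my-lab', 'psychophy-stimuli', 'stimuli/session1')  # hypothetical repo
```

The server-side part of the project would then serve these stimuli within a trial structure (e.g. a 2AFC task) and log the participants' responses.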

Goals for Brainhack:
  • Develop Python code that allows psychophysical experiments to be deployed with light requirements
  • Develop a server-side interface that parses GitHub repos to fetch experimental stimuli
  • Create simple two-alternative forced choice (2AFC) tasks

Required skills

This is a multimodal project; a mix of coding and non-coding skills is required:
Python 70%
Javascript 50%
Psychophysics 30%

Brainhack projects database

by Rémi Gau


Problem
The Brainhack has been running for more than 10 years, yet we do not have a single centralised resource showing, in a fairly exhaustive manner, the diversity of projects that have happened over the past decade.
Having a quick way to create reports about the success of Brainhack events could make it significantly easier for event organizers to look for funding.

Solution
In the past few years, more and more events have started listing their projects as GitHub issues.
This now makes it easier to:
  • start creating a mini database of:
    • brainhack events
    • brainhack sites
    • brainhack projects
  • to create an interactive dashboard to query and visualize that database (see the sketch below)
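As a starting point for such a database, here is a minimal sketch (repository name and label are hypothetical) that pulls project issues from GitHub's REST API into a list of records:

```python
import requests

def fetch_project_issues(owner, repo, label='project'):
    """Return title/URL/date records for issues carrying a given label."""
    url = f'https://api.github.com/repos/{owner}/{repo}/issues'
    params = {'state': 'all', 'labels': label, 'per_page': 100}
    resp = requests.get(url, params=params, timeout=10)
    resp.raise_for_status()
    # The issues endpoint also returns pull requests; skip them
    return [{'title': issue['title'],
             'url': issue['html_url'],
             'created': issue['created_at']}
            for issue in resp.json() if 'pull_request' not in issue]

# records = fetch_project_issues('brainhack-marseille', 'projects-2022')  # hypothetical repo
```

Records collected this way across events and sites could then feed an interactive dashboard (e.g. with pandas and a plotting library).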

Required skills

This is a multimodal project; a mix of coding and non-coding skills is required:
Python 50%
GitHub 50%
Databases / APIs / interactive dashboards 50%

Sulcilab: A collaborative sulci labelling web app

by Bastien Cagna



Sulcilab is a web-based application to manually label sulcal graphs (outputs of BrainVISA).
I propose this project with two goals:
  • Get feedback from beta users
  • Propose to develop a JavaScript-based package providing a 2D/3D viewer for neuroimaging
How to get started?
  • Ask me to create an account for you and then go to http://babalab.fr:3000 to test the app.
Where to find key resources?

Required skills

This is a multimodal project; a mix of coding and non-coding skills is required:
Have a computer 100%
Python 70%

Removing pink noise (1/f) in LFP across different electrophysiological recording systems

by Laura López Galdo, Shrabasti Jana, Camila Losada, Hasnae Agouram, Cléo Schoeffel & Nilanjana Nandi



We will try to remove the aperiodic (1/f) noise found in the spectrum of LFP data using the fooof module (https://github.com/fooof-tools/fooof). We will parametrize our signals into their aperiodic and periodic components and make comparisons across the different frequency bands.
We have data from human EEG recordings, LFP recordings from monkey Utah arrays, and monkey laminar intracranial recordings across different areas. We want to see the effect of pink noise in each of these setups and try to clean the signal.
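As an illustration, here is a minimal sketch, assuming the fooof package is installed (pip install fooof), of parameterizing a power spectrum into its aperiodic (1/f) and periodic components; the signal below is a random placeholder for a real LFP/EEG channel:

```python
import numpy as np
from scipy.signal import welch
from fooof import FOOOF

fs = 1000                                 # sampling rate (Hz), hypothetical
lfp = np.random.randn(60 * fs)            # placeholder for one recorded channel

# Power spectral density of the signal
freqs, psd = welch(lfp, fs=fs, nperseg=2 * fs)

# Fit the spectral parameterization model between 1 and 45 Hz
fm = FOOOF(max_n_peaks=6)
fm.fit(freqs, psd, [1, 45])

offset, exponent = fm.aperiodic_params_   # aperiodic (1/f) component
peaks = fm.peak_params_                   # oscillatory peaks (center freq, power, bandwidth)
print(f'aperiodic exponent: {exponent:.2f}')
```

Comparing the fitted aperiodic exponent (or subtracting the aperiodic fit) across setups is one way to assess the pink-noise contribution of each recording system.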
Some related literature can be found in the following links:

Required skills

This is a multimodal project; a mix of coding and non-coding skills is required:
Electrophysiological data 100%
Python 70%
Sharing ideas 60%
literature review 30%

Decoding the bulk signal from fiber photometry during rat behavior

by Maya Williams, Mickael Degoult & Christelle Baunez



I am collecting calcium signals from infralimbic, prelimbic, and anterior cingulate projection terminals at the subthalamic nucleus (STN) using fiber photometry. The goal is to look at the signal from these regions during sucrose and cocaine self-administration, to discover the role of these pathways in food vs cocaine intake. In addition, we will focus on discrete behaviors during the trials (lever pressing, reward delivery, error responses) to study the signals of these pathways during both sucrose and cocaine intake.
It will be the first time the hyperdirect pathway is studied in such detail, looking at the multiple cortical projections to the STN and comparing their roles in food vs cocaine rewards. From previous work in the lab, we know that STN lesions have opposite effects on cocaine and food intake, increasing food intake while decreasing cocaine intake. This study will explore the much less studied hyperdirect pathway and its role in addiction-like behaviors, so that the STN can be considered a target for therapeutic treatment in addiction, without harming a person's natural motivation for food.
I have collected data from sucrose-taking rats and adapted an open-source code to fit my data. I now need help with further analysis, combining results into groups, and doing statistical analysis.
Some codes I have used or wish to try:

Required skills

This is a multimodal project; a mix of coding and non-coding skills is required:
Matlab 80%
TFD* 20%
* Teaching For Dummies

Team

Sylvain Takerkart

Research Engineer

David Meunier

Research Engineer

Dipankar Bachar

Research Engineer

Julia Sprenger

Engineer

Melina Cordeau

PhD student

Alexandre Pron

Research Engineer

Simon Moré

Engineer

Etienne Combrisson

Postdoc

Guillaume Auzias

Researcher

Matthieu Gilson

Junior Professor

Manuel Mercier

Research Engineer

Arnaud Le Troter

Research Engineer

Contact

Location:

Faculté de Médecine, 27 Boulevard Jean Moulin, 13005 Marseille