mastouille.fr is one of the many independent Mastodon servers you can use to participate in the fediverse.
Mastouille is a sustainable, open Mastodon instance hosted in France.

Administered by:

Server stats:

617
active accounts

#dynamicalsystems

0 posts · 0 participants · 0 posts today

re-#introduction
Hi Fediscience! I am an Assistant Professor of Mechanical Engineering at University of Hawaiʻi at Mānoa (Honolulu). I got here starting from Physics training with many scientific detours into data-driven models, complex systems, nanomaterial self-assembly, human learning of complex networks, naval ships, and design problems.
I grew up in Belarus and have *opinions* on that region of the world. I've been on Fediverse since late 2022 when *something* happened to our previous cybersocial infrastructure, but the previous server I was on is sunsetting. Please come say hi and recommend cool people to follow here.
I have a blog with longer thoughts on science-adjacent topics.
aklishin.science/blog/
#ComplexSystems #NetworkScience #DataScience #DynamicalSystems #CollectiveBehavior #StatisticalPhysics

Andrei A. Klishin, Ph.D. · Blog

A few days back, I posted some #AnimatedGifs of the exact solution for a large-amplitude, undamped, unforced #Pendulum. I then thought to complete the study by including the case where the pendulum has been fed enough #energy to undergo #FullRotations rather than mere #oscillations. Well, it turns out to be “a bit more complicated than I first expected”, but I finally managed it.
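For anyone who wants to reproduce the distinction numerically, here is a minimal pure-Python RK4 sketch (my own toy code, not the poster's). In units where the equation is θ̈ = −sin θ, the energy E = θ̇²/2 − cos θ separates the two regimes at the separatrix value E = 1: below it the pendulum oscillates, above it the angle winds up without bound.

```python
import math

def pendulum_rk4(theta0, omega_init, dt=0.001, t_max=20.0):
    """Integrate theta'' = -sin(theta) with classical 4th-order Runge-Kutta."""
    def deriv(theta, omega):
        return omega, -math.sin(theta)
    theta, omega = theta0, omega_init
    thetas = [theta]
    for _ in range(int(t_max / dt)):
        k1t, k1o = deriv(theta, omega)
        k2t, k2o = deriv(theta + 0.5*dt*k1t, omega + 0.5*dt*k1o)
        k3t, k3o = deriv(theta + 0.5*dt*k2t, omega + 0.5*dt*k2o)
        k4t, k4o = deriv(theta + dt*k3t, omega + dt*k3o)
        theta += dt * (k1t + 2*k2t + 2*k3t + k4t) / 6
        omega += dt * (k1o + 2*k2o + 2*k3o + k4o) / 6
        thetas.append(theta)
    return thetas

def regime(theta0, omega_init):
    """Energy E = v^2/2 - cos(theta); the separatrix sits at E = 1."""
    E = 0.5 * omega_init**2 - math.cos(theta0)
    return "rotation" if E > 1 else "oscillation"

# Below the separatrix the angle stays bounded; above it, it keeps winding.
osc = pendulum_rk4(0.0, 1.5)   # E = 0.125 < 1  -> oscillation
rot = pendulum_rk4(0.0, 2.5)   # E = 2.125 > 1  -> full rotation
```

The exact solutions the poster animated involve Jacobi elliptic functions; the energy criterion above is the cheap way to tell which branch you are on.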

Can time series (TS) #FoundationModels (FM) like Chronos zero-shot generalize to unseen #DynamicalSystems (DS)? #AI

No, they cannot!

But *DynaMix* can, the first TS/DS foundation model based on principles of DS reconstruction, capturing the long-term evolution of out-of-domain DS: arxiv.org/pdf/2505.13192v1

Unlike TS foundation models, DynaMix exhibits #ZeroShotLearning of long-term stats of unseen DS, incl. attractor geometry & power spectrum, w/o *any* re-training, just from a context signal.
It does so with only 0.1% of the parameters of Chronos & 10x faster inference times than the closest competitor.

It often even outperforms TS FMs on forecasting diverse empirical time series, like weather, traffic, or medical data, typically used to train TS FMs.
This is surprising, because DynaMix's training corpus consists *solely* of simulated limit cycles & chaotic systems, with no empirical data at all!

And no, it’s neither based on Transformers nor Mamba – it’s a new type of mixture-of-experts architecture based on the recently introduced AL-RNN (proceedings.neurips.cc/paper_f), specifically trained for DS reconstruction.

Remarkably, DynaMix not only generalizes zero-shot to novel DS, but it can even generalize to new initial conditions and regions of state space not covered by the in-context information.

We dive a bit into the reasons why current time series FMs not trained for DS reconstruction fail, and conclude that a DS perspective on time series forecasting & models may help to advance the #TimeSeriesAnalysis field.
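The distinction between pointwise forecasts and long-term statistics is worth making concrete. A toy sketch of the idea (mine, not DynaMix or the paper's evaluation code), using the fully chaotic logistic map: two orbits from nearby initial conditions soon disagree point by point, yet their attractor-level statistics, such as the mean of the invariant density (analytically 1/2 for this map), agree to sampling error.

```python
def logistic_orbit(x0, n, burn=1000):
    """Iterate x -> 4x(1-x) (fully chaotic logistic map); drop a transient."""
    x = x0
    for _ in range(burn):
        x = 4.0 * x * (1.0 - x)
    orbit = []
    for _ in range(n):
        x = 4.0 * x * (1.0 - x)
        orbit.append(x)
    return orbit

a = logistic_orbit(0.2, 100_000)
b = logistic_orbit(0.2 + 1e-9, 100_000)

# Pointwise, the two orbits disagree almost everywhere...
pointwise_gap = max(abs(x - y) for x, y in zip(a, b))

# ...but the long-term statistic (mean of the invariant density) matches.
mean_a = sum(a) / len(a)
mean_b = sum(b) / len(b)
```

A model that reproduces long-term statistics of an unseen system (the claim made for DynaMix) is doing something qualitatively different from one that minimizes short-horizon pointwise error.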

Continued thread

(10/n) If you’ve made it this far, you’ll definitely want to check out the full paper. Grab your copy here:
biorxiv.org/content/10.1101/20
📤 Sharing is highly appreciated!
#compneuro #neuroscience #NeuroAI #dynamicalsystems

bioRxiv · From spiking neuronal networks to interpretable dynamics: a diffusion-approximation framework

Modeling and interpreting the complex recurrent dynamics of neuronal spiking activity is essential to understanding how networks implement behavior and cognition. Nonlinear Hawkes process models can capture a large range of spiking dynamics, but remain difficult to interpret, due to their discontinuous and stochastic nature. To address this challenge, we introduce a novel framework based on a piecewise deterministic Markov process representation of the nonlinear Hawkes process (NH-PDMP) followed by a diffusion approximation. We analytically derive stability conditions and dynamical properties of the obtained diffusion processes for single-neuron and network models. We established the accuracy of the diffusion approximation framework by comparing it with exact continuous-time simulations of the original neuronal NH-PDMP models. Our framework offers an analytical and geometric account of the neuronal dynamics repertoire captured by nonlinear Hawkes process models, both for the canonical responses of single neurons and neuronal-network dynamics, such as winner-take-all and traveling-wave phenomena. Applied to human and nonhuman primate recordings of neuronal spiking activity during speech processing and motor tasks, respectively, our approach revealed that task features can be retrieved from the dynamical landscape of the fitted models. The combination of NH-PDMP representations and diffusion approximations thus provides a novel dynamical analysis framework to reveal single-neuron and neuronal-population dynamics directly from models fitted to spiking data.

Competing Interest Statement: The authors have declared no competing interest.

Yesterday, I posted an image of the #LorenzAttractor showing the evolution of three trajectories (shown in red, green and blue) starting close together. Here, I’ve made it into a little animation to show how the paths initially stay close to each other but after about a quarter of the duration plotted, they #diverge from each other irrevocably (i.e. become uncorrelated) but remain part of the #ChaoticAttractor.

What is the connection between fractal geometry and systems at a critical point undergoing a phase transition? This is one of the more useful ideas to have emerged from the study of dynamical systems, but it's often buried too deep in the study of modeling for most people to encounter it; then it gets explained badly in pop-science books.

At last here is a video that will set you right:

youtube.com/watch?v=vwLb3XlPCB
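The quantitative bridge between the two topics is fractal dimension. As a toy illustration (my own sketch, not from the video): estimate the box-counting dimension of the Sierpinski triangle, generated by the chaos game, and compare it to the exact value log 3 / log 2 ≈ 1.585.

```python
import math, random

def sierpinski_points(n, seed=0):
    """Chaos game: repeatedly jump halfway toward a random triangle vertex."""
    random.seed(seed)
    verts = [(0.0, 0.0), (1.0, 0.0), (0.5, math.sqrt(3)/2)]
    x, y = 0.1, 0.1
    pts = []
    for i in range(n + 100):           # discard a short transient
        vx, vy = random.choice(verts)
        x, y = (x + vx)/2, (y + vy)/2
        if i >= 100:
            pts.append((x, y))
    return pts

def box_counting_dimension(pts, scales=range(2, 8)):
    """Least-squares slope of log N(eps) vs log(1/eps), eps = 2**-k."""
    xs, ys = [], []
    for k in scales:
        inv = 2**k
        boxes = {(int(px*inv), int(py*inv)) for px, py in pts}
        xs.append(k * math.log(2))     # log(1/eps)
        ys.append(math.log(len(boxes)))
    mx, my = sum(xs)/len(xs), sum(ys)/len(ys)
    return (sum((u-mx)*(v-my) for u, v in zip(xs, ys))
            / sum((u-mx)**2 for u in xs))

d = box_counting_dimension(sierpinski_points(200_000))
# Exact Hausdorff dimension of the Sierpinski triangle: log 3 / log 2 ≈ 1.585
```

At a critical point, cluster boundaries and correlation structures show the same kind of non-integer scaling, which is why the same box-counting machinery shows up in both contexts.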

Replied in thread

@ekmiller @DrYohanJohn @dumoulin @bwyble @laurentperrinet @NicoleCRust @achristensen56
so the suspicion I have from this is that traveling waves and other wavelike phenomena like scroll and spiral waves that you see in other excitable dynamical systems are actually way way way more common and way way way more determinative of function than we typically appreciate, but because we can't "untangle" the connectomic topology it plays out on we don't notice them as such and it ends up looking like the quasi-independent salt-and-pepper activity we usually describe it as. Everyone should read Art Winfree's *The Geometry of Biological Time* for how these dynamical regimes are almost unavoidable in excitable dynamical systems!

But again with the "multiple dynamical regimes at different scales" thing - wavelike phenomena should also happen, with a much smaller but still nonzero effect, in a quasi-Euclidean way via #EphapticCoupling - extracellular space is obvs tightly packed full of resistive tissue and everything, so it's not strictly Euclidean either, but more so than the connectomic dynamical manifold.

This is the kinda thing that makes me wish I'd stayed in neuroscience, bc I feel like these are sort of inescapable truths of how the brain works - that should be super important for understanding it! - but I have seen almost no work that really takes them seriously (though I'd love to see it, because I'm sure it's out there)
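How unavoidable traveling waves are in excitable media is easy to see in simulation. A minimal 1-D FitzHugh-Nagumo cable in its standard textbook form (my sketch, not tied to any of the work discussed above): briefly stimulate the left end and a pulse propagates the whole length of the medium on its own.

```python
# 1-D excitable medium:
#   u_t = u - u^3/3 - v + D*u_xx + I(x, t)
#   v_t = eps * (u + a - b*v)
N, D, dt, dx = 100, 1.0, 0.05, 1.0
eps, a, b = 0.08, 0.7, 0.8
u = [-1.2] * N          # approximate resting state
v = [-0.625] * N
reached = None          # first time the far end of the cable fires

for step in range(8000):                 # 400 time units
    stim_on = step * dt < 5.0            # brief stimulus at the left end
    lap = [0.0] * N
    for i in range(N):
        left = u[i-1] if i > 0 else u[i]         # no-flux boundaries
        right = u[i+1] if i < N - 1 else u[i]
        lap[i] = (left - 2*u[i] + right) / dx**2
    for i in range(N):
        I = 0.8 if (stim_on and i < 5) else 0.0
        du = u[i] - u[i]**3/3 - v[i] + D*lap[i] + I
        dv = eps * (u[i] + a - b*v[i])
        u[i] += dt * du
        v[i] += dt * dv
    if reached is None and u[N-5] > 0.0:
        reached = step * dt
# 'reached' records when the excitation wave arrives at the far end,
# long after the stimulus has been switched off.
```

The point the thread is making: once you have excitability plus local coupling, this kind of regenerative wave is the generic behavior; what varies is only the topology it propagates over.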

📢 New preprint out! 📢
We investigate how and why disruption of the astrocyte network (through knockout of gap junctions) affects the neuronal bursting phenotype.
Combining experiments with a new mathematical model based on synaptic short-term plasticity, we found that the main affected mechanism is afterhyperpolarization via KCNQ channel regulation!

biorxiv.org/content/10.1101/20

bioRxiv · Astroglial gap junctions strengthen hippocampal network activity by sustaining afterhyperpolarization via KCNQ channels

Throughout the brain, astrocytes form networks mediated by gap-junction channels that promote the activity of neuronal ensembles. Although their inputs on neuronal information processing are well established, how molecularly gap junction channels shape neuronal network patterns remains unclear. Here using astroglial connexin-deficient mice, in which astrocytes are disconnected and neuronal bursting patterns are abnormal, we found that astrocyte networks strengthen bursting activity via dynamic regulation of extracellular potassium levels, independently of glutamate homeostasis or metabolic support. Using a novel facilitation-depression model, we identified neuronal afterhyperpolarization as the key parameter underlying bursting patterns regulation by extracellular potassium in mice with disconnected astrocytes. We confirmed experimentally this prediction, and revealed that astroglial network-control of extracellular potassium sustains neuronal afterhyperpolarization via activation of KCNQ voltage-gated K+ channels. Altogether, these data delineate how astroglial gap-junctions mechanistically strengthen neuronal population bursts, and points to approaches for controlling aberrant activity in neurological diseases.

Competing Interest Statement: The authors have declared no competing interest.
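The "facilitation-depression model" family the abstract refers to can be sketched in its classic Tsodyks-Markram form (a generic textbook sketch with made-up parameters, not the paper's actual model): a release probability u that facilitates with each spike and a resource pool x that depresses, with the synaptic response proportional to u·x.

```python
import math

def tm_synapse(spike_times, U=0.2, tau_rec=0.5, tau_fac=1.0):
    """Classic Tsodyks-Markram short-term plasticity.
    x: fraction of available resources (depression variable),
    u: utilization / release probability (facilitation variable).
    Returns the relative synaptic response u*x at each presynaptic spike."""
    x, u = 1.0, 0.0
    last_t = None
    amps = []
    for t in spike_times:
        if last_t is not None:
            dt = t - last_t
            x = 1.0 - (1.0 - x) * math.exp(-dt / tau_rec)  # resources recover
            u = u * math.exp(-dt / tau_fac)                # facilitation decays
        u = u + U * (1.0 - u)    # spike arrives: utilization jumps...
        amps.append(u * x)       # ...and the response is proportional to u*x
        x = x - u * x            # a fraction u of the resources is consumed
        last_t = t
    return amps

# A 20 Hz train: with a low baseline U and slow facilitation decay, the
# second response facilitates above the first, then depression takes over.
amps = tm_synapse([i * 0.05 for i in range(10)])
```

The interplay of these two time constants is what lets such models reproduce bursting statistics, which is presumably why the authors extend this family rather than a plain static-synapse network.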

Replication crisis 🔜 Paradigm change?

“A possible lack of tangible improvement after years of struggling to establish reproducible causal effects might well be the signal that it is time for the field of psychology to break the Kuhnian resistance to paradigmatic change, and to embrace what the complex dynamical systems paradigm has to offer.”

New preprint: psyarxiv.com/nbfwe/

#Psychology
#MetaScience
#ReplicationCrisis
#PhilosophyOfScience
#PhilSci
#ComplexSystems
#DynamicalSystems

Just moved to neuromatch.social, so here it goes (again), #introduction :

Hi everyone, I'm a last year undergrad in #Neuroscience & #ComputerScience at McGill. I'm doing #ComputationalNeuroscience research in the Baillet Lab at The Neuro (MNI), focusing on whole-brain dynamical models of coupled neural masses calibrated to #MEG #Neuroimaging data (more details @ neurolife77.github.io/ if anyone is curious).

I am also the VP of the #MachineLearning committee at PharmaHacks, a hackathon that blends #Biology & #DataScience with a focus on #Pharma.
@neuroscience #neurodon

------------------------ Bonus ------------------------

Since this new server's longer post limit gives me the space to put it in the same post, here's a bonus:

I regularly share links to preprints that catch my attention and tag them with: #arxivfeed

I started doing this because I thought the arxiv bots on Mastodon weren't super efficient, but after doing it for about a month, I'd say it's also a good way to keep some form of history of my nightly exploration of the literature in my fields of interest. I usually share stuff about #ComputationalNeuroscience, #Neuroimaging, #DynamicalSystems, #MachineLearning, #ArtificialIntelligence, etc.

Disclaimer: I usually only read the abstract or skim through them at the time of posting.
Disclaimer 2: I am definitely not consistent.

neurolife77.github.io · Dominic Boutet · Website
Dominic Boutet personal website. I am a Neuroscience and Computer Science student at McGill University...