Jekyll2023-04-07T14:16:21+00:00https://maguilera.net/feed.xmlMiguel Aguileracomplex systems, neuroscience and cognitionMiguel AguileraIkerbasque Research Fellow position2022-10-01T00:00:00+00:002022-10-01T00:00:00+00:00https://maguilera.net/blog/BCAM<p>I’m thrilled to announce that I’m starting a new position as an <a href="https://calls.ikerbasque.net/images/stories/2022_ikerbasque_research_fellowships_cofund_wolfram2.pdf">Ikerbasque Research Fellow</a> at the <a href="http://www.bcamath.org/en/">Basque Center for Applied Mathematics</a>, where I will join the <a href="http://www.bcamath.org/en/research/lines/MCEN">Mathematical, Computational and Experimental Neuroscience</a> team.</p>
<p>I’m really excited to start this new period in a position that will allow me to start new long-term research lines and projects! My <a href="https://www.ikerbasque.net/eu/miguel-aguilera">research line</a> will include nonequilibrium aspects of neural computation, information thermodynamics of autonomous agents and research at the interface of statistical physics, complex systems and enactive cognition.</p>Miguel AguileraHow particular is the physics of the free energy principle?2022-03-17T00:00:00+00:002022-03-17T00:00:00+00:00https://maguilera.net/blog/How-particular-FEP<p>In a <a href="https://www.sciencedirect.com/science/article/pii/S1571064521000749">new paper</a> in Physics of Life Reviews, written in collaboration with Beren Millidge, Alexander Tschantz and Chris Buckley, I present a deep mathematical exploration of the <a href="https://en.wikipedia.org/?curid=39403556">free energy principle</a> (FEP). The FEP has gained important momentum as an ambitious attempt to provide a unifying theory of mind and life, explained in terms of approximate Bayesian inference. However, there are few examples of applications of the theory to specific systems, and <a href="https://www.mdpi.com/1099-4300/23/3/293/htm">recent technical critiques</a> have questioned the generality of its underlying assumptions.</p>
<p>By analytically solving a family of linear nonequilibrium systems, we explored the necessary steps to derive the claims of the FEP, finding two main problems with the theory in its current form:</p>
<ol>
<li>Some of the assumptions about the required statistical properties of the target system (e.g. Markov blankets) arise only for very narrow regions of the parameter space. Thus, the plausibility of the FEP is dependent on the assumption that biological systems can operate in a non-trivial regime.</li>
<li>The main result of the FEP, which describes the behaviour of a system as minimizing a variational free energy, is in general a poor description of that behaviour: it relies on an implicit equivalence between the dynamics of the average states of a system and the average of the dynamics of those states, which ignores the system’s history of interactions (i.e. its previous trajectory).</li>
</ol>
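<p>The first point can be illustrated with a toy numerical check (a rough sketch under assumed parameter values, not the paper’s actual analysis). For a linear stochastic system with a Gaussian stationary density, internal and external states are conditionally independent given the blanket states exactly when the corresponding entry of the stationary precision matrix vanishes, and a generic choice of couplings does not satisfy this:</p>

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Hypothetical 4-variable linear system dx = A x dt + dW, states ordered as
# (external, sensor, action, internal); the candidate blanket is (sensor, action).
A = np.array([[-1.0, 0.0, 0.5, 0.0],   # external driven by action
              [0.8, -1.0, 0.0, 0.0],   # sensor driven by external
              [0.0, 0.0, -1.0, 0.7],   # action driven by internal
              [0.0, 0.6, 0.0, -1.0]])  # internal driven by sensor

# Stationary covariance S solves the Lyapunov equation A S + S A^T + I = 0.
S = solve_continuous_lyapunov(A, -np.eye(4))

# For a Gaussian density, external (0) and internal (3) are conditionally
# independent given the blanket iff precision[0, 3] == 0.
precision = np.linalg.inv(S)
print(precision[0, 3])  # generically nonzero: no exact Markov blanket
```

<p>Tuning the couplings so that this entry vanishes exactly is only possible on a measure-zero slice of parameter space, which is one way to read the “very narrow regions” claim above.</p>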
<p>These issues suggest that the claims of the theory should be taken with caution, and that more development is needed before it can be applied to understand living and cognitive processes.</p>
<p>Aguilera, M, Millidge, B, Tschantz, A & Buckley CL (2022). <a href="https://www.sciencedirect.com/science/article/pii/S1571064521000749">How particular is the physics of the free energy principle?</a>. <em>Physics of Life Reviews</em> 40:24-50; <a href="https://doi.org/10.1016/j.plrev.2021.11.001">https://doi.org/10.1016/j.plrev.2021.11.001</a></p>
<p><strong>Abstract</strong></p>
<p>The free energy principle (FEP) states that any dynamical system can be interpreted as performing Bayesian inference upon its surrounding environment. Although, in theory, the FEP applies to a wide variety of systems, there has been almost no direct exploration or demonstration of the principle in concrete systems. In this work, we examine in depth the assumptions required to derive the FEP in the simplest possible set of systems – weakly-coupled non-equilibrium linear stochastic systems. Specifically, we explore (i) how general the requirements imposed on the statistical structure of a system are and (ii) how informative the FEP is about the behaviour of such systems. We discover that two requirements of the FEP – the Markov blanket condition (i.e. a statistical boundary precluding direct coupling between internal and external states) and stringent restrictions on its solenoidal flows (i.e. tendencies driving a system out of equilibrium) – are only valid for a very narrow space of parameters. Suitable systems require an absence of perception-action asymmetries that is highly unusual for living systems interacting with an environment. More importantly, we observe that a mathematically central step in the argument, connecting the behaviour of a system to variational inference, relies on an implicit equivalence between the dynamics of the average states of a system with the average of the dynamics of those states. This equivalence does not hold in general even for linear stochastic systems, since it requires an effective decoupling from the system’s history of interactions. These observations are critical for evaluating the generality and applicability of the FEP and indicate the existence of significant problems of the theory in its current form. 
These issues make the FEP, as it stands, not straightforwardly applicable to the simple linear systems studied here and suggest that more development is needed before the theory could be applied to the kind of complex systems that describe living and cognitive processes.</p>Miguel AguileraA unifying framework for mean-field theories of asymmetric kinetic Ising systems2021-02-19T00:00:00+00:002021-02-19T00:00:00+00:00https://maguilera.net/blog/Unifying-mean-field<p>I’m happy to see this <a href="https://www.nature.com/articles/s41467-021-20890-5">new paper</a>, in collaboration with Amin Moosavi and Hideaki Shimazaki, published in Nature Communications. In this paper we study and unify different kinetic mean-field methods for Ising models, with the objective of developing tools for studying large data-sets from networks of neurons in non-equilibrium conditions and near critical, maximally fluctuating regimes. We propose a framework that integrates previous methods under an information geometric perspective. This framework also allows us to propose new methods under atypical assumptions for mean-field methods.</p>
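<p>As a minimal illustration of the kind of object studied in the paper (the simplest first-order approximation, not the paper’s own more sophisticated expansions), a naive mean-field method for an asymmetric kinetic Ising model propagates magnetizations as m_i(t+1) = tanh(h_i + Σ_j J_ij m_j(t)), which can be checked against Glauber-dynamics sampling in the weak-coupling regime where it is accurate. All parameter values below are illustrative:</p>

```python
import numpy as np

rng = np.random.default_rng(0)
N, T, R = 20, 5, 5000              # units, time steps, sampled trajectories
h = rng.normal(0.0, 0.3, N)        # external fields
J = rng.normal(0.0, 0.1, (N, N)) / np.sqrt(N)  # weak asymmetric couplings

# Naive mean-field (first-order) propagation of the magnetizations.
m = np.zeros(N)
for _ in range(T):
    m = np.tanh(h + J @ m)

# Parallel Glauber dynamics: P(s_i(t+1)=+1 | s(t)) = (1 + tanh(h_i + sum_j J_ij s_j(t))) / 2.
s = np.where(rng.random((R, N)) < 0.5, 1.0, -1.0)   # unbiased initial states
for _ in range(T):
    p_up = 0.5 * (1.0 + np.tanh(h + s @ J.T))
    s = np.where(rng.random((R, N)) < p_up, 1.0, -1.0)

err = np.max(np.abs(m - s.mean(axis=0)))
print(err)  # small for weak couplings, where naive mean field is accurate
```

<p>Near critical, strongly fluctuating regimes this naive approximation degrades, which is precisely where the higher-order, correlation-preserving methods of the paper become relevant.</p>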
<p>The University of Sussex’s Media Relations team has been kind enough to write <a href="https://www.eurekalert.org/pub_releases/2021-02/uos-mso021621.php">this nice piece</a> on our paper, describing the importance of developing methods for studying and modelling the activity of neurons in non-equilibrium conditions and near the edge of chaos.</p>
<p>Aguilera, M, Moosavi, SA & Shimazaki H (2021). <a href="https://www.nature.com/articles/s41467-021-20890-5">A unifying framework for mean-field theories of asymmetric kinetic Ising systems</a>. <em>Nature Communications</em> 12:1197; <a href="https://doi.org/10.1038/s41467-021-20890-5">https://doi.org/10.1038/s41467-021-20890-5</a></p>
<p><strong>Abstract</strong></p>
<p>Kinetic Ising models are powerful tools for studying the non-equilibrium dynamics of complex systems. As their behavior is not tractable for large networks, many mean-field methods have been proposed for their analysis, each based on unique assumptions about the system’s temporal evolution. This disparity of approaches makes it challenging to systematically advance mean-field methods beyond previous contributions. Here, we propose a unifying framework for mean-field theories of asymmetric kinetic Ising systems from an information geometry perspective. The framework is built on Plefka expansions of a system around a simplified model obtained by an orthogonal projection to a sub-manifold of tractable probability distributions. This view not only unifies previous methods but also allows us to develop novel methods that, in contrast with traditional approaches, preserve the system’s correlations. We show that these new methods can outperform previous ones in predicting and assessing network properties near maximally fluctuating regimes.</p>Miguel AguileraCritical integration in neural and cognitive systems: Beyond power-law scaling as the hallmark of soft assembly2021-01-30T00:00:00+00:002021-01-30T00:00:00+00:00https://maguilera.net/blog/Critical-Integration<p>I just published a <a href="https://www.sciencedirect.com/science/article/pii/S0149763421000233">new paper</a> with Ezequiel Di Paolo in Neuroscience & Biobehavioral Reviews. Here we review the link between self-organized criticality and integrated information and ideas about soft assembly in neural and cognitive processes. Studying power-law scaling as an indicator of properties like soft assembly, self-organization or interaction-dominant dynamics is a suggestive approach to understanding cognitive processes from a complex systems perspective. However, critics have suggested that these approaches operate mostly at the level of analogy or metaphor between real phenomena and idealized toy models. We suggest that this issue can be resolved by exploring what specific kinds of criticality we should expect from cognitive agents, and in particular a type of criticality related to integrated information theory: a system’s susceptibility to changes in its own integration <a href="https://doi.org/10.1016/j.neunet.2019.03.001">(Aguilera & Di Paolo, 2019)</a>. We find that identifying this critical integration is more informative than power-law measures about the underlying processes of a system (e.g. agent-environment asymmetries, robustness of sensorimotor interaction).</p>
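<p>For concreteness, here is a minimal sketch of the kind of power-law scaling analysis under discussion: a basic detrended fluctuation analysis (DFA-1), which estimates a scaling exponent from the log-log slope of fluctuation size against window size. This is an illustrative implementation on synthetic data, not the paper’s own measure; uncorrelated noise yields an exponent near 0.5, while long-range correlated activity yields larger values:</p>

```python
import numpy as np

def dfa_exponent(x, scales):
    """Detrended fluctuation analysis (DFA-1): slope of log F(n) vs log n."""
    y = np.cumsum(x - np.mean(x))          # integrated profile of the signal
    fluct = []
    for n in scales:
        n_win = len(y) // n
        f2 = 0.0
        for w in range(n_win):
            seg = y[w * n:(w + 1) * n]
            t = np.arange(n)
            coef = np.polyfit(t, seg, 1)   # linear detrending in each window
            f2 += np.mean((seg - np.polyval(coef, t)) ** 2)
        fluct.append(np.sqrt(f2 / n_win))
    slope, _ = np.polyfit(np.log(scales), np.log(fluct), 1)
    return slope

rng = np.random.default_rng(1)
scales = [16, 32, 64, 128, 256]
alpha = dfa_exponent(rng.normal(size=20000), scales)
print(alpha)  # close to 0.5 for uncorrelated (white) noise
```

<p>The critics’ point is visible here: the exponent alone says nothing about what generates it, which is why the paper argues for the more system-specific notion of critical integration.</p>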
<p>Aguilera, M & Di Paolo, EA (2021). <a href="https://www.sciencedirect.com/science/article/pii/S0149763421000233">Critical integration in neural and cognitive systems: Beyond power-law scaling as the hallmark of soft assembly</a>. <em>Neuroscience & Biobehavioral Reviews</em> 123; https://doi.org/10.1016/j.neubiorev.2021.01.009</p>
<p><strong>Abstract</strong></p>
<p>Inspired by models of self-organized criticality, a family of measures quantifies long-range correlations in neural and behavioral activity in the form of self-similar (e.g., power-law scaled) patterns across a range of scales. Long-range correlations are often taken as evidence that a system is near a critical transition, suggesting interaction-dominant, softly assembled relations between its parts. Psychologists and neuroscientists frequently use power-law scaling as evidence of critical regimes and soft assembly in neural and cognitive activity. Critics, however, argue that this methodology operates at most at the level of an analogy between cognitive and other natural phenomena. This is because power-laws do not provide information about a particular system’s organization or what makes it specifically cognitive. We respond to this criticism using recent work in Integrated Information Theory. We propose a more principled understanding of criticality as a system’s susceptibility to changes in its own integration, a property cognitive agents are expected to manifest. We contrast critical integration with power-law measures and find the former more informative about the underlying processes.</p>Miguel AguileraNew website2020-10-02T00:00:00+00:002020-10-02T00:00:00+00:00https://maguilera.net/blog/new-website<p>Welcome to my new Jekyll website. I’ll migrate most of the content from the old one, which is still available at <a href="https://maguilera0.wordpress.com">https://maguilera0.wordpress.com</a>.</p>Miguel AguileraDIMENSIVE project2020-06-29T00:00:00+00:002020-06-29T00:00:00+00:00https://maguilera.net/blog/DIMENSIVE-project<p>I’m happy to share that I have been granted a <a href="https://ec.europa.eu/research/mariecurieactions/actions/individual-fellowships_en">MSCA-IF grant</a> to start a project at the <a href="https://www.sussex.ac.uk/">University of Sussex</a> in collaboration with <a href="https://christopherlbuckley.com/">Chris Buckley</a>.
The project aims to apply information-theoretic and inference methods to develop models of neural activity from zebrafish larvae in closed loop behavior, e.g. trying to apply theoretical methods for approximating the behaviour of very large networks and inferring their parameters from experimental data.</p>
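<p>One simple instance of the kind of parameter inference mentioned above is the inverse problem for a small kinetic Ising model: recovering couplings from observed spin trajectories by gradient ascent on the log-likelihood. This is an illustrative sketch with synthetic data and assumed parameters, not the project’s actual methods, which target much larger systems:</p>

```python
import numpy as np

rng = np.random.default_rng(2)
N, T = 5, 20000                      # small network, long synthetic recording
J_true = rng.normal(0.0, 0.3, (N, N))

# Generate a trajectory with parallel Glauber dynamics:
# P(s_i(t+1)=+1 | s(t)) = (1 + tanh(sum_j J_ij s_j(t))) / 2.
s = np.ones((T + 1, N))
for t in range(T):
    p_up = 0.5 * (1.0 + np.tanh(s[t] @ J_true.T))
    s[t + 1] = np.where(rng.random(N) < p_up, 1.0, -1.0)

# Maximum-likelihood gradient: dL/dJ_ij = <(s_i(t+1) - tanh(theta_i(t))) s_j(t)>_t,
# where theta_i(t) = sum_j J_ij s_j(t).
J_est = np.zeros((N, N))
prev, nxt = s[:-1], s[1:]
for _ in range(300):
    grad = (nxt - np.tanh(prev @ J_est.T)).T @ prev / T
    J_est += 0.5 * grad

err = np.max(np.abs(J_est - J_true))
print(err)  # small reconstruction error given enough data
```

<p>Exact gradients like this scale poorly; for the very large networks targeted by the project, one would replace them with mean-field or other approximate likelihoods.</p>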
<h2 id="dimensive-data-driven-inference-of-models-from-embodied-neural-systems-in-vertebrate-experiments">DIMENSIVE: Data-driven Inference of Models from Embodied Neural Systems In Vertebrate Experiments</h2>
<p>A major challenge in cognitive neuroscience is to understand how behaviour arises from the dynamical interaction of an organism’s nervous system, its body, and its environment. Understanding embodied neural activity involves the resolution of various conceptual, technical and methodological issues in explaining how living organisms self-organize at many levels (from neural bio-chemistry to behaviour and learning). Currently, two important obstacles hinder this endeavour: the difficulties in recording neural activity in behaving animals, and the lack of mathematical tools to characterize the complex brain-body-environment interactions in living organisms. In this project we will address current limits by implementing an interdisciplinary combination of novel animal behaviour neuroimaging setups and large-scale statistical methods, with the goal of recording and modelling whole-brain activity of locomoting vertebrates. We will study fictively swimming larval zebrafish during active behaviour in a pioneering experimental setup, recording neural activity utilizing light-sheet microscopy for calcium imaging in different virtual reality scenarios involving sensorimotor manipulations. In this setup, we will collect data from the distributed neural circuits that integrate sensory signals from the environment (exafferent input) and their own movements (reafferent input), as well as plastic processes of habituation to new sensorimotor contingencies. From this data, we will infer large-scale generative models (i.e. 
models capable of yielding synthetic data resembling the studied phenomena) of embodied neural circuits by complementing dynamical models and techniques from statistical mechanics with innovative information theoretic and Bayesian inference methods and approximations for very large systems in non-equilibrium and non-stationary conditions.</p>Miguel AguileraScaling Behaviour and Critical Phase Transitions in Integrated Information Theory2019-12-17T00:00:00+00:002019-12-17T00:00:00+00:00https://maguilera.net/blog/scaling-IIT<p>I just published a <a href="https://www.mdpi.com/1099-4300/21/12/1198">new paper</a> exploring some of the ideas we initiated in our <a href="https://doi.org/10.1016/j.neunet.2019.03.001">Integrated information in the thermodynamic limit</a> (Aguilera & Di Paolo, 2019) paper. Here, I explore in detail many of the assumptions of Integrated Information Theory (specifically IIT 3.0) by computing integration in large kinetic Ising networks presenting a critical point. Using a simple model whose statistical properties can be analytically characterized, I show that some assumptions in the theory are problematic for capturing some of the properties associated with critical phase transitions. This example compels researchers interested in IIT and related indices of complexity to apply such measures under careful examination of their design assumptions. Rather than applying the measure off-the-shelf, this work offers some methods to explore in more depth the assumptions behind the measure and how they apply to each situation.</p>
<p>Aguilera, M (2019). <a href="https://www.mdpi.com/1099-4300/21/12/1198">Scaling Behaviour and Critical Phase Transitions in Integrated Information Theory</a>. <em>Entropy</em> 2019, <em>21</em>(12), 1198; https://doi.org/10.3390/e21121198</p>
<p><strong>Abstract</strong></p>
<p>Integrated Information Theory proposes a measure of conscious activity ($\Phi$), characterised as the irreducibility of a dynamical system to the sum of its components. Due to its computational cost, current versions of the theory (IIT 3.0) are difficult to apply to systems larger than a dozen units, and, in general, it is not well known how integrated information scales as systems grow larger in size. In this article, we propose to study the scaling behaviour of integrated information in a simple model of a critical phase transition: an infinite-range kinetic Ising model. In this model, we assume a homogeneous distribution of couplings to simplify the computation of integrated information. This simplified model allows us to critically review some of the design assumptions behind the measure and connect its properties with well-known phenomena in phase transitions in statistical mechanics. As a result, we point to some aspects of the mathematical definitions of IIT 3.0 that fail to capture critical phase transitions and propose a reformulation of the assumptions made by integrated information measures.</p>Miguel AguileraIntegrated information in the thermodynamic limit2019-03-20T00:00:00+00:002019-03-20T00:00:00+00:00https://maguilera.net/blog/IIT-limit<p>Together with Ezequiel Di Paolo, we have just published a new paper in which we explore how integrated information scales in very large systems. The capacity to integrate information is crucial for biological, neural and cognitive processes, and it is regarded by Integrated Information Theory (IIT) proponents as a measure of conscious activity. In this paper we compute (analytically and numerically) the value of IIT measures ($\Phi$) for a family of Ising models of infinite size. This is exciting since it allows us to explore situations far beyond the kind of systems that can usually be analyzed in IIT, which is generally limited to a few units due to its computational cost. Moreover, our analysis allows us to connect features of integrated information with well-known features of critical phase transitions in statistical mechanics.</p>
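<p>The critical point underlying these divergences can be illustrated with the textbook self-consistency equation of an infinite-range Ising model, m = tanh(βJm) (a generic mean-field sketch, not the paper’s exact model): below βJ = 1 the only stable solution is m = 0, and above it a nonzero magnetization appears.</p>

```python
import numpy as np

def stationary_magnetization(beta_j, m0=0.5, n_iter=500):
    """Fixed-point iteration of the mean-field self-consistency m = tanh(beta_j * m)."""
    m = m0
    for _ in range(n_iter):
        m = np.tanh(beta_j * m)
    return m

for beta_j in (0.5, 1.0, 2.0):
    print(beta_j, stationary_magnetization(beta_j))
# subcritical (0.5): m -> 0; supercritical (2.0): m -> nonzero branch (~0.96)
```

<p>At the critical value βJ = 1 the fixed-point iteration converges only algebraically slowly, one signature of the diverging susceptibilities that the paper connects to integrated information.</p>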
<p>Aguilera, M & Di Paolo, EA (2019). <a href="https://doi.org/10.1016/j.neunet.2019.03.001">Integrated information in the thermodynamic limit</a>. <em>Neural Networks</em>, Volume 114, June 2019, Pages 136-146. https://doi.org/10.1016/j.neunet.2019.03.001</p>
<p><strong>Abstract</strong></p>
<p>The capacity to integrate information is a prominent feature of biological, neural, and cognitive processes. Integrated Information Theory (IIT) provides mathematical tools for quantifying the level of integration in a system, but its computational cost generally precludes applications beyond relatively small models. In consequence, it is not yet well understood how integration scales up with the size of a system or with different temporal scales of activity, nor how a system maintains integration as it interacts with its environment. After revising some assumptions of the theory, we show for the first time how modified measures of information integration scale when a neural network becomes very large. Using kinetic Ising models and mean-field approximations, we show that information integration diverges in the thermodynamic limit at certain critical points. Moreover, by comparing different divergent tendencies of blocks that make up a system at these critical points, we can use information integration to delimit the boundary between an integrated unit and its environment. Finally, we present a model that adaptively maintains its integration despite changes in its environment by generating a critical surface where its integrity is preserved. We argue that the exploration of integrated information for these limit cases helps in addressing a variety of poorly understood questions about the organization of biological, neural, and cognitive systems.</p>Miguel AguileraAdaptation to criticality through organizational invariance in embodied agents2018-05-21T00:00:00+00:002018-05-21T00:00:00+00:00https://maguilera.net/blog/adaptation-to-criticality<p>I just published with Manuel Bedia a <a href="https://www.nature.com/articles/s41598-018-25925-4#Abs1">paper in <em>Scientific Reports</em></a> that results from an exploration of how tools from statistical mechanics could be used to model adaptive mechanisms. In this paper, we explore how adaptation to criticality could be used as a general adaptive mechanism in robots controlled by a neural network, using a simple mechanism that preserves a specific structure of correlations. This has interesting implications for thinking about neural and cognitive systems, which instead of relying on internal representations of an external world could adapt by preserving a complex structure of internal correlations.</p>
<p>Aguilera, M & Bedia, MG (2018). <a href="https://www.nature.com/articles/s41598-018-25925-4#Abs1">Adaptation to criticality through organizational invariance in embodied agents</a>. <em>Scientific Reports</em> <strong>8</strong>, Article number: 7723 (2018). https://doi.org/10.1038/s41598-018-25925-4</p>
<p><strong>Abstract:</strong> Many biological and cognitive systems do not operate deep within one or other regime of activity. Instead, they are poised at critical points located at phase transitions in their parameter space. The pervasiveness of criticality suggests that there may be general principles inducing this behaviour, yet there is no well-founded theory for understanding how criticality is generated at a wide span of levels and contexts. In order to explore how criticality might emerge from general adaptive mechanisms, we propose a simple learning rule that maintains an internal organizational structure from a specific family of systems at criticality. We implement the mechanism in artificial embodied agents controlled by a neural network maintaining a correlation structure randomly sampled from an Ising model at critical temperature. Agents are evaluated in two classical reinforcement learning scenarios: the Mountain Car and the Acrobot double pendulum. In both cases the neural controller appears to reach a point of criticality, which coincides with a transition point between two regimes of the agent’s behaviour. These results suggest that adaptation to criticality could be used as a general adaptive mechanism in some circumstances, providing an alternative explanation for the pervasive presence of criticality in biological and cognitive systems.</p>Miguel AguileraRhythms of the Collective Brain: Metastable Synchronization and Cross-Scale Interactions in Connected Multitudes2018-03-20T00:00:00+00:002018-03-20T00:00:00+00:00https://maguilera.net/blog/rhythms-collective<p>I have recently published a new paper where we model social coordination in self-organized crowds on social media, using data from the 15M movement in Spain:</p>
<ul>
<li>Aguilera, M (2018). <a href="https://www.hindawi.com/journals/complexity/2018/4212509/">Rhythms of the collective brain: Metastable synchronization and cross-scale interactions in connected multitudes</a>. <em>Complexity</em> Volume 2018, Article ID 4212509. https://doi.org/10.1155/2018/4212509</li>
</ul>
<h4 id="abstract">Abstract</h4>
<p>Crowd behaviour challenges our fundamental understanding of social phenomena. Involving complex interactions between multiple temporal and spatial scales of activity, its governing mechanisms defy conventional analysis. Using 1.5 million Twitter messages from the 15M movement in Spain as an example of multitudinous self-organization, we describe the coordination dynamics of the system measuring phase-locking statistics at different frequencies using wavelet transforms, identifying 8 frequency bands of entrained oscillations between 15 geographical nodes. Then we apply maximum entropy inference methods to describe Ising models capturing transient synchrony in our data at each frequency band. The models show that all frequency bands of the system operate near critical points of their parameter space and while fast frequencies present only a few metastable states displaying all-or-none synchronization, slow frequencies present a diversity of metastable states of partial synchronization. Furthermore, describing the state at each frequency band using the energy of the corresponding Ising model, we compute transfer entropy to characterize cross-scale interactions between frequency bands, showing a cascade of upward information flows in which each frequency band influences its contiguous slower bands and downward information flows where slow frequencies modulate distant fast frequencies.</p>Miguel Aguilera