We are interested in understanding how network function is shaped by underlying network architecture, and artificial networks enable us to explore the implications of different design constraints. With infinite resources, one could build a nearly perfect system (if one could define "perfect"...). But any real system faces constraints -- e.g. on material, on dynamic range, and on sensitivity to noise. How should these constraints be balanced to best achieve different functionalities?
In both biological and artificial networks, interacting elements must be able to communicate with one another efficiently. If these elements are connected by physical pathways (picture towns connected by roadways), then perhaps efficient communication would be best served by building pathways between all pairs of elements (or roads between every pair of towns). This would certainly enable fast communication between any two elements, but it would be quite expensive to build. If there is only enough material for a certain number of pathways (or only enough money to fund the construction of a fixed number of roads), where should these pathways be constructed? Is it better to build many short pathways, a few long pathways, or some combination of the two?
We are exploring these questions in networks of pulse-coupled oscillators. Within this framework, we can ask how variations in network topology shape the relay of information across a network (for visualizations of network dynamics, see the visualizations page). With appropriate modifications, we can account for constraints imposed by physical embedding and by variations in the fidelity of building materials.
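As a rough illustration of this framework (not our actual model), the following sketch simulates identical pulse-coupled oscillators on an arbitrary adjacency matrix; the function name, the coupling strength `eps`, and the discrete-time update are all our own illustrative choices. Swapping in different adjacency matrices is one simple way to probe how topology shapes the relay of activity.

```python
import numpy as np

def simulate_pulse_coupled(adj, phases, eps=0.05, dt=0.01, steps=2000):
    """Discrete-time sketch of identical pulse-coupled oscillators.

    Each phase advances at unit rate; on crossing 1 an oscillator fires,
    resets to 0, and advances each neighbor's phase by `eps`. The 0/1
    matrix `adj` encodes which pathways exist, so rewiring `adj` is how
    one probes the effect of topology on the relay of activity.
    Returns the phase trajectory, shape (steps, n).
    """
    phases = np.asarray(phases, dtype=float).copy()
    traj = np.empty((steps, phases.size))
    for t in range(steps):
        phases += dt                              # free phase advance
        fired = phases >= 1.0
        if fired.any():
            kick = eps * adj[fired].sum(axis=0)   # pulses from all firers
            phases = np.where(fired, 0.0, phases + kick)
            # clip so a strongly kicked oscillator fires on the next step
            phases = np.minimum(phases, 1.0 - 1e-9)
        traj[t] = phases
    return traj
```

For example, an all-to-all `adj` (every pathway built) can be compared against a sparse ring (few, short pathways) to see how quickly firing activity propagates across the network.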
While efficient communication might be an immediate goal of biological networks, it might also be important that these networks can integrate information over time. This involves both retaining past information (requiring some stability) and incorporating new information (requiring some flexibility). Intuitively, these seem to be competing goals -- is it possible to achieve a balance of both? If so, are there architectural designs that are better or worse at achieving this balance?
We explore these questions using feedforward artificial neural networks. Because these networks have been studied for decades, a wide variety of tools exist for analyzing their behavior. Yet some basic questions, such as the functional role of different topologies, remain poorly understood. We have shown that variations in topology shape the dynamic constraints on the network, which in turn give rise to trade-offs between stability and flexibility. These studies give insight into the ability of a neural network to balance memory (stable retention of old information) and learning (flexible integration of new information).
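As a minimal sketch of how topology can be imposed on a feedforward network, one can fix connectivity with a 0/1 mask and measure how strongly a small input perturbation moves the output. This is a crude proxy we define here for illustration (large sensitivity suggests flexibility, small suggests stability); the function names and parameters are our own, not a standard metric.

```python
import numpy as np

def masked_forward(x, weights, masks):
    """Forward pass through a feedforward net whose topology is fixed by
    0/1 masks: a zero entry means that pathway simply does not exist."""
    h = x
    for W, M in zip(weights, masks):
        h = np.tanh((W * M) @ h)   # masked weights, tanh nonlinearity
    return h

def output_sensitivity(x, weights, masks, delta=1e-4, seed=0):
    """Finite-difference proxy: how strongly a small random input
    perturbation of size `delta` moves the output."""
    rng = np.random.default_rng(seed)
    base = masked_forward(x, weights, masks)
    pert = masked_forward(x + delta * rng.standard_normal(x.shape),
                          weights, masks)
    return np.linalg.norm(pert - base) / delta
```

Comparing this quantity across dense and sparse masks (with the same underlying weights) gives a first look at how topology alone constrains the network's responsiveness.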
Biological networks change over time, particularly during growth and development. The resulting networks exhibit complex patterns of connectivity that differ from benchmark topologies, such as random and small-world networks. While we have some understanding of the types of growth rules that could generate these benchmark topologies, the growth rules underlying biological networks (such as human brain networks) are not well understood. Furthermore, many existing models explore growth in non-embedded networks, where physical space constraints do not play a role. We are currently exploring the effects of physical embedding on network growth in models of the human brain.
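One plausible family of growth rules, sketched below purely for illustration, penalizes long connections in an embedded space: a new node attaches preferentially to well-connected nodes, discounted by distance. The function name and the parameters `lam` (distance penalty scale) and `m` (edges per new node) are illustrative assumptions, not a model from the literature or our own published work.

```python
import numpy as np

def grow_spatial_network(n, lam=0.3, m=2, seed=0):
    """Grow a network of n nodes at random 2-D positions.

    Each new node i attaches to m existing nodes, chosen with probability
    proportional to (degree + 1) * exp(-distance / lam), so that both
    popularity and physical proximity shape where pathways are built.
    Returns node positions and the symmetric 0/1 adjacency matrix.
    """
    rng = np.random.default_rng(seed)
    pos = rng.random((n, 2))                # embed nodes in the unit square
    adj = np.zeros((n, n), dtype=int)
    adj[0, 1] = adj[1, 0] = 1               # seed edge between first two nodes
    for i in range(2, n):
        d = np.linalg.norm(pos[:i] - pos[i], axis=1)
        deg = adj[:i, :i].sum(axis=1)
        w = (deg + 1) * np.exp(-d / lam)    # distance-penalized attachment
        targets = rng.choice(i, size=min(m, i), replace=False, p=w / w.sum())
        adj[i, targets] = adj[targets, i] = 1
    return pos, adj
```

Varying `lam` interpolates between strongly local wiring (small `lam`) and space-blind preferential attachment (large `lam`), which is one simple way to ask how physical embedding reshapes the resulting topology.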