Network models (gene regulation)
Course Bioinformatic Processes 2016/2017; Paulien Hogeweg;
Theoretical Biology and Bioinformatics Group, Utrecht University
Last time:
CA as modeling tool
generalizations, baseline expectations: "pattern default"
CA and ODEs/MAPs as dynamical systems
- alternative simplifications
- common features (types of attractors etc.)
- "almost all cases"; order parameter
Model of models
- Mean field approximation/assumption
- dynamics of mesoscale ’entities’ (particles)
QUESTIONS?
TODAY:
modeling in terms of subsystems (cont.)
Network models
Boolean networks as model for gene regulatory networks:
- multiple attractors (= cell types)
- domains of attraction, reachability, alternative transients.
- Understanding/interpreting gene knockouts
- Dynamic properties of Encode human regulatory network
- Isologous diversification
dynamical systems:
decomposition into many simple systems (cont.)
NETWORKS
- Neural net: connected
- yeast transcription net: information transfer
- KEGG metabolic net: mass conservation, stoichiometric
Gene regulation Networks:
“full” transcription network of yeast
How does it behave?
how special is it?
(evolution)
Boolean Networks
Proposed by S. Kauffman (1969) as a model for gene regulation
Like a binary CA, but:
- specific network structure (I/O relations)
- specific interactions (not local)
- each node has its own transition rule
(Boolean function with K inputs)
Boolean network: special cases can be mapped onto a CA
(homogeneous network structure, “rule-layer”)
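To make the definition concrete, here is a minimal sketch (in Python, with illustrative names and parameters, not code from the course) of a Kauffman-style random Boolean network: every node gets K randomly chosen inputs and its own randomly drawn Boolean function, and all nodes are updated synchronously.

```python
import random

def make_rbn(n_nodes, k, seed=0):
    """Random Boolean network: each node gets K random inputs and
    its own random Boolean function (a lookup table of 2^K bits)."""
    rng = random.Random(seed)
    inputs = [rng.sample(range(n_nodes), k) for _ in range(n_nodes)]
    rules = [[rng.randint(0, 1) for _ in range(2 ** k)] for _ in range(n_nodes)]
    return inputs, rules

def step(state, inputs, rules):
    """Synchronous update: every node reads its inputs from the
    current state and applies its own transition rule."""
    new = []
    for ins, rule in zip(inputs, rules):
        idx = 0
        for i in ins:                 # encode the input pattern as an integer
            idx = (idx << 1) | state[i]
        new.append(rule[idx])
    return tuple(new)

if __name__ == "__main__":
    inputs, rules = make_rbn(n_nodes=10, k=2)
    state = tuple(random.Random(1).randint(0, 1) for _ in range(10))
    for t in range(8):
        print(t, "".join(map(str, state)))
        state = step(state, inputs, rules)
```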
Multiple attractors
What kind of behavior do we expect from gene
regulation networks?
multiple attractors (cell types)
alternative trajectories from A’ and A” to B
multiple causes
robustness (knockouts)
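A small illustrative sketch of how these expectations can be checked by brute force: enumerate all 2^N states of a small random Boolean network, follow each state to its attractor, and report attractor periods with basin ("domain of attraction") sizes. The network construction and sizes below are arbitrary choices, not the networks from the slides.

```python
import itertools, random

N, K = 8, 2
rng = random.Random(42)
# random network: K inputs and a random Boolean rule per node
inputs = [rng.sample(range(N), K) for _ in range(N)]
rules = [[rng.randint(0, 1) for _ in range(2 ** K)] for _ in range(N)]

def step(state):
    return tuple(rules[j][sum(state[i] << p for p, i in enumerate(inputs[j]))]
                 for j in range(N))

basin = {}                         # attractor cycle -> basin size
for bits in itertools.product((0, 1), repeat=N):
    seen, s = {}, bits
    while s not in seen:           # iterate until the trajectory revisits a state
        seen[s] = len(seen)
        s = step(s)
    cycle_start = seen[s]          # states visited from here on form the cycle
    cycle = frozenset(x for x, t in seen.items() if t >= cycle_start)
    basin[cycle] = basin.get(cycle, 0) + 1

for cyc, size in sorted(basin.items(), key=lambda kv: -kv[1]):
    print(f"attractor of period {len(cyc)}, basin size {size}")
```

Typically several distinct attractors appear, each reached from many initial states: the multiple-attractors / cell-type picture in miniature.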
2 pathways to neutrophil differentiation
Huang et al. 2005 (Phys. Rev. Letters)
gene expression through time
2773-dimensional state space,
2^2773 states!
trajectories in 2D projection
Robustness: Forcing structures
Properties of Random Boolean Networks (depending
on K)
Importance of sampling method: dependence on K is
dependence on the fraction of (non-)forcing rules!
Non-forcing rules in 1D CA (k=2)
Conclusion: Boolean Kauffman networks
Important:
Identification of cell state with attractor of gene regulation
network
Multiple attractors in simple networks
alternative trajectories to attractor
Domain of attraction: i.e. "robustness"
forcing functions: i.e. "robustness"
NOT IMPORTANT (WRONG!): connectivity of 2 as "ideal"
Gene expression data --> Boolean networks
Functional Overlap and Regulatory Links Shape Genetic Interactions between Signaling Pathways
Sake van Wageningen, Patrick Kemmeren, ..., Berend Snel
and Frank C.P. Holstege, Cell, Dec 2010
141 kinases, 38 phosphatases in yeast.
60% of single knockouts show "no phenotype"
(== fewer than 8 genes different from WT) (single growth condition)
Double knockouts: 21 buffering effects with another
kinase/phosphatase
double knockout expression profiles
example of mixed epistasis
filamentous growth vs mating
2 simpler networks with the same effect
(the more complex network is most similar to the experimentally
inferred network)
Many networks (max 2 inputs per node) with the same effect!
all buffering pairs: many non-homologs!; many mixed
regulatory networks via mixed response networks
Boolean networks
boolean functions vs threshold functions
how to make a XOR?
simple random network (threshold dynamics)
(Anton Crombach)
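Why "how to make a XOR?" is a real question for threshold dynamics: XOR is not linearly separable, so no single threshold unit can compute it, while a small two-layer threshold circuit can. An illustrative brute-force check (the integer weight grid is an arbitrary choice):

```python
import itertools

CASES = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]  # XOR truth table

def threshold(x, w, theta):
    """Output 1 iff the weighted input sum reaches the threshold."""
    return int(sum(wi * xi for wi, xi in zip(w, x)) >= theta)

# Brute force: no single 2-input threshold unit realizes XOR,
# even over a generous grid of integer weights and thresholds.
grid = range(-3, 4)
single = any(all(threshold(x, (w1, w2), th) == y for x, y in CASES)
             for w1, w2, th in itertools.product(grid, grid, grid))
print("single threshold unit can do XOR:", single)      # -> False

# Two layers suffice: XOR(x1,x2) = (x1 OR x2) AND NOT (x1 AND x2)
def xor_two_layer(x1, x2):
    h_or  = threshold((x1, x2), (1, 1), 1)               # OR unit
    h_and = threshold((x1, x2), (1, 1), 2)               # AND unit
    return threshold((h_or, h_and), (1, -1), 1)          # OR minus AND

print(all(xor_two_layer(*x) == y for x, y in CASES))     # -> True
```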
MULTIPLE ATTRACTORS
ONLY 10 nodes (= genes)!
STATE SPACE
Human gene regulatory network
as reconstructed in ENCODE project
Gerstein et al 2012, Nature
some static properties, signatures of proximal network
(Gerstein et al)
what about dynamic properties (attractors)?
How “special” is the network?
Jelmer de Ronde, MSc project
Not only topology, but also type of interactions needed
Only partially known.
Therefore assign randomly and
study many realisations
each with many initial conditions
synchronous/asynchronous updating
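A sketch of the two updating schemes under the stated assumptions (interaction signs assigned at random, threshold dynamics); all sizes and numbers below are illustrative, not the ENCODE network itself. Synchronous updating changes all genes at once; asynchronous updating changes one randomly chosen gene per step, which makes the dynamics a Markov chain.

```python
import random

rng = random.Random(0)
N = 20
# fixed topology, randomly assigned interaction types
# (+1 activation, -1 repression), 3 inputs per gene
edges = {j: [(rng.randrange(N), rng.choice((-1, 1))) for _ in range(3)]
         for j in range(N)}

def gene_next(state, j):
    """Threshold rule: gene j turns on iff its summed input is positive."""
    return int(sum(sign * state[i] for i, sign in edges[j]) > 0)

def sync_step(state):
    # synchronous: all genes updated simultaneously from the old state
    return tuple(gene_next(state, j) for j in range(N))

def async_step(state):
    # asynchronous: one randomly chosen gene updated (Markov chain)
    j = rng.randrange(N)
    s = list(state)
    s[j] = gene_next(state, j)
    return tuple(s)

state = tuple(rng.randint(0, 1) for _ in range(N))
for _ in range(100):
    state = sync_step(state)      # or async_step(state)
print("".join(map(str, state)))
```

Attractors can then be collected over many random realisations and many initial conditions, as described above.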
attractors of (A) proximal ENCODE network
synchronous and asynchronous updating (Markov chain)
Full graphs, > 0.0001% cutoff
Full synchronous:
full network |V|: 115825, |E|: 115825
sub network |V|: 219, |E|: 219
1818
Full asynchronous:
full network |V|: 198711, |E|: 1254886
sub network |V|: 136, |E|: 299
{219, 74, 136}
161
synchronous and asynchronous attractors similar
[Figure residue: overlap matrix between synchronous and asynchronous attractors; almost all entries are 0, with sparse nonzero overlaps (14, 4, 9, 8, 7, 5, 4, 1, 3, 3, and eight 2s).]
How special is the network?
compare with randomizations
how special is the network: proximal vs full network
ENCODE network: yet another layer of control!
+++
e.g. epigenetic markers, DNA folding/packaging etc.
Neural networks and pattern recognition
BOTH gene regulatory networks and neural networks are modeled as threshold functions
Binary or continuous activation; connection weights
Neural networks (as models and as pattern recognition tools)
and (deep) learning
Pattern recognition: attractors == patterns (Xi) to be recognized
Domain of attraction: recognized as Xi
Learning: adjust connection weights such that attractors of
network are indeed ”something”
Supervised and non-supervised learning
Distributed representation vs “grandmother cell”
Different connection topologies
Hopfield networks, fully connected, distributed
representation
multiple attractors of the dynamical system
max N-1 attractors/patterns in network with N nodes
2-node network: binary switch
pattern completion
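An illustrative Hopfield sketch: Hebbian storage of a few ±1 patterns in a fully connected weight matrix, then pattern completion from a corrupted cue by repeated threshold updates. Pattern count and network size are made-up example values.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 64, 3
patterns = rng.choice((-1, 1), size=(P, N))       # patterns to store (+-1)

# Hebbian learning: sum of outer products, no self-coupling
W = sum(np.outer(p, p) for p in patterns) / N
np.fill_diagonal(W, 0)

def recall(cue, sweeps=20):
    """Asynchronous threshold updates until (hopefully) an attractor."""
    s = cue.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):              # update units in random order
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# pattern completion: flip 20% of one stored pattern and let it relax
cue = patterns[0].copy()
flip = rng.choice(N, size=N // 5, replace=False)
cue[flip] *= -1
out = recall(cue)
print("overlap with stored pattern:", (out * patterns[0]).mean())  # ~1.0
```

The corrupted cue lies in the domain of attraction of the stored pattern, so the dynamics restore it: recognition as relaxation to an attractor.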
Upscaling of NN models: 1 (now 10) billion
connections!
non-supervised recognition of cats
Le et al, Google 2012
compared to a multilayer, specialized, sparse, localized 'SOM', but....
net architecture
weights of the cat neuron
best recognized cats
sparse encoding
Olshausen and Field, Nature 1996
Deep learning: multilayer supervised learning
long history; since 2011 very successful (HOT)
(face recognition/speech recognition/navigation)
backpropagation (BP)
mostly a combination of unsupervised and supervised learning
upscaling + little tricks
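A minimal backpropagation sketch (a generic example, not any system from the slides): a two-layer sigmoid network trained by gradient descent on the XOR task from earlier; layer size, learning rate and iteration count are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([[0], [1], [1], [0]], float)           # XOR targets

W1, b1 = rng.normal(0, 1, (2, 4)), np.zeros(4)      # hidden layer (4 units)
W2, b2 = rng.normal(0, 1, (4, 1)), np.zeros(1)      # output layer

sigmoid = lambda z: 1 / (1 + np.exp(-z))
lr = 1.0

for step in range(5000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # backward pass: backpropagate the squared-error gradient
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # gradient-descent weight updates
    W2 -= lr * h.T @ d_out;  b2 -= lr * d_out.sum(0)
    W1 -= lr * X.T @ d_h;    b1 -= lr * d_h.sum(0)

print(out.round(2).ravel())   # usually approaches [0, 1, 1, 0]
```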
Overview: single-level (autonomous) dynamical systems
timing regimes     continuous var.    discrete var./nominal entities
continuous time    ODE                EVENT
discrete time      MAPS               FSM (n-FSMs: CAs, B-nets)
EVENT based models: continuous time, discrete
events
Gillespie algorithm
can be seen as a stochastic ODE
Example: logistic stochastic population growth
dN/dt = aN − bN² + noise    (a = a1 − a2, b = b1 − b2)
EVENT based
all events (birth + death):
e0 = (a1 + a2)N + (b1 + b2)N²    (total event rate; births: a1N + b2N², deaths: a2N + b1N²)
τ = (1/e0) ln(1/rand1); T = T + τ
N = N + 1 if rand2 ∗ e0 < a1N + b2N²    (birth event)
else N = N − 1    (death event)
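A runnable sketch of the event-based scheme above (Gillespie's direct method for the logistic birth/death process). The split into a1/a2 and b1/b2 follows the slide's notation as reconstructed here, and the rate constants are illustrative.

```python
import math, random

# linear and density-dependent rates (illustrative values):
a1, a2 = 1.0, 0.2        # linear birth, linear death:     a = a1 - a2
b1, b2 = 0.01, 0.0       # quadratic death, quadratic birth: b = b1 - b2

rng = random.Random(0)
N, T, T_END = 5, 0.0, 50.0

while T < T_END and N > 0:
    birth = a1 * N + b2 * N * N            # total rate of +1 events
    death = a2 * N + b1 * N * N            # total rate of -1 events
    e0 = birth + death                     # total event rate
    T += -math.log(1.0 - rng.random()) / e0  # exponential waiting time tau
    if rng.random() * e0 < birth:          # choose event type by its share of e0
        N += 1
    else:
        N -= 1

print(f"T = {T:.1f}, N = {N}")   # fluctuates around a/b = 80 for these rates
```

For large N the trajectory follows the deterministic logistic curve; for small N the discreteness of events produces the noise term (and a nonzero extinction probability) that the ODE description lacks.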