Hello, I'm a PhD student in bioinformatics. I have recently been asked to model a discrete-time event system from a data set. I was given a set of binary vector pairs representing the system's evolution over [t, t+Dt], something like this:

...
[0,0,1,0,1,0] [0,1,1,1,1,0]
[0,1,1,0,1,0] [0,0,1,1,1,0]
[0,0,1,0,1,1] [0,1,1,1,1,0]
[1,0,1,0,1,1] [0,1,0,1,1,0]
[0,0,1,0,1,0] [0,0,0,0,1,0]
...

In practice, it is an abstract representation of viral evolution: each vector represents a viral genetic code, and each element represents the presence of a mutation at a given position.

Given the size of the binary vectors (20), modelling the state space with a first-order Markov chain would require (at most) 2^20 distinct states. Even though I found this is not the case in practice (many configurations are not physically possible), I still had to estimate a transition matrix over about a hundred states from a data set of five hundred transition pairs.

So I had a look at Petri nets and thought about modelling the same system using a number of places equal to the vector size (so, 20 places). But what about the transitions? How could I estimate the topology of the net (also taking the stochasticity of the events into account)? I know that reducing the set of states requires an exponential increase in arcs to reach the same expressive power as the Markov chain. Are there algorithms that build the net from such data sets? Or is there any theory about this?
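For concreteness, this is roughly how I estimate the Markov transition matrix from the pairs (a minimal sketch in Python; the 6-bit toy pairs below just mirror the example above, whereas the real data uses length-20 vectors and about five hundred pairs). The point is that I only enumerate states actually observed in the data, not all 2^n configurations:

```python
from collections import Counter, defaultdict

# Toy transition pairs (state at t, state at t+Dt), states as 0/1 tuples.
# These are illustrative values, not the real data set.
pairs = [
    ((0,0,1,0,1,0), (0,1,1,1,1,0)),
    ((0,1,1,0,1,0), (0,0,1,1,1,0)),
    ((0,0,1,0,1,1), (0,1,1,1,1,0)),
    ((1,0,1,0,1,1), (0,1,0,1,1,0)),
    ((0,0,1,0,1,0), (0,0,0,0,1,0)),
]

# Count transitions only between states that actually occur in the data.
counts = defaultdict(Counter)
for s, t in pairs:
    counts[s][t] += 1

# Maximum-likelihood estimate: P(t | s) = count(s -> t) / count(s -> anything)
P = {
    s: {t: c / sum(outs.values()) for t, c in outs.items()}
    for s, outs in counts.items()
}

# The state (0,0,1,0,1,0) was observed twice, with two different successors,
# so each successor gets probability 0.5 here.
print(P[(0,0,1,0,1,0)])
```

Of course, with only five hundred pairs over a hundred observed states, many entries of this matrix rest on very few observations, which is exactly why I am looking for a more compact (place-based) representation.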
Thanks in advance,
Mattia.

[[ Petri Nets World: ]]
[[ http://www.informatik.uni-hamburg.de/TGI/PetriNets/ ]]
[[ Mailing list FAQ: ]]
[[ http://www.informatik.uni-hamburg.de/TGI/PetriNets/pnml/faq.html ]]
[[ Post messages/summary of replies: ]]
[[ [email protected] ]]
