As some of you may know, this summer I'm doing research related to graph theory and dynamical systems over graphs. In a graph dynamical system, each vertex has a state (such as a binary state) and an update function. The update function looks at the current state of the vertex and its neighbors and produces a new state. So the update function for vertex v_{i} over a set of states K is defined as f_{i} : K^{d(v_{i}) + 1} -> K, where d(v) denotes the degree of vertex v. Examples of update functions include XOR (take the sum of the vertex and neighbor states over Z_{2}) and threshold k (does the sum of the vertex and neighbor states add up to at least k?). Much of what we study is computationally intractable: even if an algorithm is linear in the number of inputs, the input itself can be exponential in size. So in the binary case, we have 2^{n} distinct state vectors to deal with.
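To make that concrete, here's a minimal sketch of the synchronous setup in Python. This isn't my actual research code; the graph is just an adjacency dict, and names like xor_update, make_threshold_update, and synchronous_step are ones I've made up for this post.

def xor_update(self_state, neighbor_states):
    # Sum of the closed neighborhood over Z_2
    return (self_state + sum(neighbor_states)) % 2

def make_threshold_update(k):
    # 1 if at least k vertices in the closed neighborhood are in state 1
    def threshold_update(self_state, neighbor_states):
        return 1 if self_state + sum(neighbor_states) >= k else 0
    return threshold_update

def synchronous_step(adj, state, local_fn):
    # Every vertex updates simultaneously from the current state vector
    return {v: local_fn(state[v], [state[u] for u in adj[v]])
            for v in adj}

# Complete graph K_4 as an adjacency dict, plus a sample binary state
adj = {v: [u for u in range(4) if u != v] for v in range(4)}
state = {0: 1, 1: 0, 2: 1, 3: 0}
print(synchronous_step(adj, state, xor_update))
print(synchronous_step(adj, state, make_threshold_update(2)))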
Over the last couple of weeks, I've been working on Python simulations (yes - a new language!) for graph dynamical systems, looking at a specific class of functions. My simulations produced some nice insights into long-term behavior over complete graphs. After talking with my boss, he wanted me to look at short-term results - more specifically, a single iteration of the update sequence as opposed to multiple updates. The idea is to see if we can gauge short-term activity at some level (still over complete graphs, at the moment).
When dealing with long-term activity, it is much easier to spot patterns. Common patterns involve fixed and periodic points: after enough iterations, do we bounce back and forth between the same two states? Do certain Hamming classes or configurations all end up at the same fixed point? Simulations are great for looking at long-term behavior, as the computer does the grunt work of running the 10-15 iterations over various random state vectors.
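For instance, a rough sketch of that kind of experiment (reusing synchronous_step and make_threshold_update from the sketch above, and again purely illustrative rather than my actual simulation code) might iterate from random state vectors and report the period of whatever the orbit falls into:

import random

def orbit(adj, state, local_fn, max_iters=15):
    # Iterate the synchronous map, stopping once a previously seen state recurs
    seen = [tuple(sorted(state.items()))]
    for _ in range(max_iters):
        state = synchronous_step(adj, state, local_fn)
        key = tuple(sorted(state.items()))
        if key in seen:
            period = len(seen) - seen.index(key)
            return state, period   # period 1 means a fixed point
        seen.append(key)
    return state, None             # no cycle detected within max_iters

# Complete graph K_5, threshold-3 updates, a few random starting states
n = 5
adj = {v: [u for u in range(n) if u != v] for v in range(n)}
for _ in range(3):
    init = {v: random.randint(0, 1) for v in range(n)}
    print(init, "->", orbit(adj, init, make_threshold_update(3)))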
However, when dealing with short-term activity, it's much harder to obtain this insight via simulation. A single update on a dynamical system won't necessarily give you a fixed or periodic point. In this case, I'm looking at perturbations: if I have a state x in K^{n} and I change the value in one vector component (call the new vector x'), will F(x) = F(x'), where F is the system's global update map? This analysis was actually easier to do by hand, and it really came down to understanding some basic combinatorics.
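Here's the same question in code form - a hedged illustration built on the earlier sketch (assuming a synchronous update for simplicity, not my actual analysis): flip one coordinate of x and check whether a single update sends x and x' to the same state.

def flip(state, v):
    # Perturb x in exactly one vertex component v
    x = dict(state)
    x[v] = 1 - x[v]
    return x

def same_image_after_flip(adj, state, local_fn, v):
    # Does F(x) == F(x') when x' differs from x only at vertex v?
    return (synchronous_step(adj, state, local_fn)
            == synchronous_step(adj, flip(state, v), local_fn))

# Complete graph K_5 with threshold-3 updates
n = 5
adj = {v: [u for u in range(n) if u != v] for v in range(n)}
x = {0: 1, 1: 1, 2: 0, 3: 0, 4: 0}
for v in range(n):
    print(v, same_image_after_flip(adj, x, make_threshold_update(3), v))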
The moral of the story is that it's just as important (and exciting!) to be able to crank out code as it is to do math! At least, in the wonderful world of graph theory, it is!
6 Comments On This Entry

astonecipher
01 August 2014 - 07:17 PM
All I can say is, you're one smart kid! (kid is relative to my age vs yours and is in no way meant to be derogatory.)