Wednesday, November 17, 2010

Possible functions to evolve using reaction networks

How would simulated evolution resolve the following challenges:

1) Measure the variance (noise) of a signal -- a rough simulation sketch for this one follows the list
2) Measure the frequency of a signal -- can be related to (1)
3) Reverse of (1), i.e. increase noise based on a deterministic signal (without affecting the mean)
4) Reverse of (2), i.e. increase frequency based on the amplitude of an input signal
5) Control the width of a bimodal distribution based on a deterministic signal
6) Adapt to an external pattern of events (i.e. learn cause-delay-effect relationships) purely through the reaction network -- this would probably be a combination of signal processing + memory
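The sketch for challenge (1), assuming a simple ODE abstraction of a reaction network: one species low-pass filters the input to track its mean, and a second species low-pass filters the squared deviation to track the variance. The rate constants and the test signal below are made up for illustration; an evolved network would presumably find something messier.

    import numpy as np

    def simulate(signal, dt=0.01, k_mean=1.0, k_var=0.2):
        # M tracks the running mean of the input S; V tracks the running mean
        # of (S - M)^2, i.e. the variance. Both reuse the same production/decay motif.
        M, V = signal[0], 0.0
        for S in signal:
            M += dt * k_mean * (S - M)           # dM/dt = k_mean * (S - M)
            V += dt * k_var * ((S - M)**2 - V)   # dV/dt = k_var * ((S - M)^2 - V)
        return M, V

    # Usage: a noisy constant signal with std 0.5; V should settle near 0.25.
    rng = np.random.default_rng(0)
    signal = 5.0 + 0.5 * rng.standard_normal(200_000)
    print(simulate(signal))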

Thursday, June 17, 2010

Design by homologous recombination

Engineers have frequently used random mutations as a means of optimizing a protein or a genetically engineered network. However, I don't think this is an effective optimization or design process. Planned mutations, such as those used by the immune system, rely on homologous recombination rather than random mutations. Using recombination, we can plan the mutation events and therefore perform a much more predictable optimization. The optimization process can even be simulated computationally.
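A minimal sketch of what planned optimization by recombination could look like computationally: instead of random point mutations, candidates are assembled by recombining a predefined library of homologous segments, so the whole candidate set can be enumerated and scored. The segment library and fitness function here are made-up placeholders for a real assay or model.

    import itertools

    # Hypothetical library of interchangeable segments at each homology block.
    segment_library = [
        ["ATG", "ATA", "ATT"],         # block 1 variants
        ["GGC", "GGA"],                # block 2 variants
        ["TTC", "TTA", "TTG", "TTT"],  # block 3 variants
    ]

    def fitness(seq):
        # Toy objective (GC content) standing in for a real screen or simulation.
        return sum(base in "GC" for base in seq) / len(seq)

    # Because every candidate is a planned combination, the search is exhaustive
    # and predictable rather than a random walk through sequence space.
    candidates = ["".join(blocks) for blocks in itertools.product(*segment_library)]
    best = max(candidates, key=fitness)
    print(best, fitness(best))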

Thursday, June 3, 2010

Cost-based hypothesis for evolution of modules

Suppose Amazon and eBay both want some algorithm that finds common patterns in customer purchasing histories. Suppose a third company is developing exactly such an algorithm. It is cost-effective for Amazon and eBay to outsource this pattern-finding problem to the third company, because the third company only charges half of what it would cost either of them to get the job done themselves. The third company is able to charge half the normal price because it gets paid by both companies. This can be considered evolution of modularity in economics, or niche finding.

The same reasoning might apply to modular structures in biology. It is perhaps easiest to think of an ecosystem first, where each species is like a company. If one species provides a function that is needed by the other species, then the ecosystem comes to depend on that first species. In other words, the ecosystem will probably evolve so that there is a reserved seat for the first species. Moving the analogy down to a population of cells, or to the population of genes within a cell, is a bit different, but I think some of it still applies. The underlying idea is that the system favors those species that are necessary for the whole system to survive, and the reduction of cost is the incentive for new species to evolve and take on a specific role in the system.

Monday, May 17, 2010

Directed evolution using engineered cells

Directed evolution of cells is generally done by setting up a screening process. An example is a binding assay for evolving cell surface receptors.

It is very difficult to build screening procedures that select for functions. However, it might be possible to engineer "killer" cells that attack cells exhibiting particular types of behaviors. Thus, the killer cells themselves provide the screening process. And why limit the setup to a single type of killer cell? Of course, the killer cells should not evolve themselves (which might be an issue).

Imagine this scenario: a population of cells evolving in an environment with populations of 3 or 4 different types of killer cells, each targeting a particular type of behavior. Further, another population of "helper" cells excretes specific nutrients in response to particular behaviors. The target population should evolve to lose the behaviors targeted by the killer cells and acquire the behaviors rewarded by the helper cells. With multiple selection criteria at work, the evolution might be more gradual as well.
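A rough simulation sketch of that scenario, under made-up assumptions (each cell is reduced to a vector of behavior levels, and the killer/helper targets and mutation rates are arbitrary):

    import random

    N_TRAITS = 4
    killer_targets = [0, 1]  # behaviors penalized (attacked by killer cells)
    helper_targets = [2, 3]  # behaviors rewarded (fed by helper cells)

    def fitness(cell):
        # Several independent criteria act at once, spreading out the selection pressure.
        return sum(cell[i] for i in helper_targets) - sum(cell[i] for i in killer_targets)

    population = [[random.random() for _ in range(N_TRAITS)] for _ in range(200)]
    for generation in range(100):
        population.sort(key=fitness, reverse=True)
        survivors = population[:100]
        # Refill the population with mutated copies of the survivors.
        population = survivors + [
            [max(0.0, t + random.gauss(0, 0.05)) for t in random.choice(survivors)]
            for _ in range(100)
        ]

    # Mean trait levels: the killer-targeted behaviors should shrink, the helper-rewarded ones grow.
    print([round(sum(c[i] for c in population) / len(population), 2) for i in range(N_TRAITS)])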

Signaling - specificity and decoding

Let's consider wireless signals. The signals themselves travel in all directions, so there is no specificity there; the frequency provides the sender-receiver specificity. The signal pattern contains the information that the receiver can decode, i.e. the receiver must expect a specific type of pattern.

Comparing the general idea to biological signaling: the specificity usually comes from binding affinity, so that aspect of signaling is clear. Now for decoding the information. A pathway probably has multiple molecules serving as signal carriers. The pattern of concentrations of those input molecules *might* serve as the encoded information that the receiver, i.e. the pathway, is able to decode. The pathway then sends out a new set of molecules as output signals. Note that this results in a conversion of the signal carrier, analogous to wireless signaling, where the radio signal is decoded into some other form such as a digital signal.
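A toy sketch of that decoding step, with entirely hypothetical input names and thresholds: the "pathway" recognizes a specific pattern across its input concentrations and emits a different carrier as output.

    # Hypothetical expected pattern: which carriers the pathway reads, and their thresholds.
    EXPECTED_PATTERN = {"ligand_A": 0.5, "ligand_B": 0.2}

    def decode(inputs):
        # The pathway recognizes the pattern only if every carrier is above its threshold.
        recognized = all(inputs.get(name, 0.0) > t for name, t in EXPECTED_PATTERN.items())
        # The output is a new signal carrier, not a copy of the inputs (carrier conversion).
        return {"output_protein": 1.0 if recognized else 0.0}

    print(decode({"ligand_A": 0.8, "ligand_B": 0.3}))  # expected pattern -> output on
    print(decode({"ligand_A": 0.8, "ligand_B": 0.1}))  # wrong pattern    -> output off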

Sunday, April 18, 2010

Modules should span multiple layers

In biology, something like a protein domain is an ideal candidate for a "module" because it is a tool that can be reused in different places to serve different purposes. Nature does not want to reinvent tools; it is more efficient to reuse existing ones -- protein domains are hard to invent, so they are perfect items to reuse.

Moving on to network structure: patterns such as feedback or feed-forward motifs are not too difficult to reinvent (depending on how hard re-wiring is). For example, re-wiring genetic networks is easy, so it does not make sense to call any genetic network a "module". However, a combination of protein interactions and gene regulation might be a module.

For example, let's consider a network composed of a protein that responds to a small molecule and activates a second protein that then upregulates a gene. This network is difficult to reinvent because it involves multiple interactions that are very specific, so it would be a module worth reusing. For example, the final gene product can be replaced with some other gene -- a simple way to reuse the module.

As an additional observation, I think it makes sense to say that modules span multiple "layers". For example, in electronics, logic gates convert analog signals into digital ones. The example in the previous paragraph is a module that converts small-molecule concentrations into gene regulation.
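A minimal sketch of that layer-crossing module, using a Hill-function abstraction with made-up parameters: the small-molecule concentration sets the fraction of active protein, and the active protein sets the transcription rate of whatever gene is wired downstream.

    def module_output(ligand, K_activation=1.0, K_promoter=0.5, n=2):
        # Layer 1: small-molecule concentration -> fraction of active signaling protein.
        active_protein = ligand**n / (K_activation**n + ligand**n)
        # Layer 2: active-protein occupancy at the promoter -> transcription rate.
        return active_protein / (K_promoter + active_protein)

    # Reusing the module just means pointing this output at a different gene;
    # the ligand-to-transcription mapping itself stays fixed.
    for ligand in (0.1, 1.0, 10.0):
        print(ligand, round(module_output(ligand), 3))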

Thursday, March 18, 2010

Microbial ecosystem programming

There is a lot of hype about programming single cells by editing their genetic code, which in turn alters the dynamics of their regulatory or metabolic networks. However, deploying these engineered microbes in the real world is a questionable step, simply because we cannot predict exactly what will happen.

An alternative is to not edit the microbes themselves at all. Instead of building networks using enzymes inside the microbe, why not view the cell itself as a complex catalyst? A living cell converts some chemicals into others (depending on environmental conditions).

Using microfluidics, it might be possible to completely characterize the "catalytic" profile of hundreds of microbial species, including bacteria, fungi, amoebae, algae, and archaea. Then, one could build a "network" of different species such that the whole system is stable and performs some metabolic process that is of use to us, such as bio-remediation. This "engineered" network should be safer in the real world.
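A toy sketch of the composition step, with entirely made-up species names and conversions: treat each characterized species as a black-box catalyst with an input-to-output map, then search for a chain of species that turns a pollutant into a benign product.

    catalytic_profiles = {
        # Hypothetical measured conversions: species -> {substrate: product}.
        "species_A": {"pollutant_X": "intermediate_1"},
        "species_B": {"intermediate_1": "intermediate_2"},
        "species_C": {"intermediate_2": "benign_product"},
        "species_D": {"intermediate_1": "toxic_byproduct"},
    }

    def find_chain(start, goal, profiles):
        # Breadth-first search over chemicals; each step is one species' conversion.
        frontier = [(start, [])]
        seen = {start}
        while frontier:
            chemical, chain = frontier.pop(0)
            if chemical == goal:
                return chain
            for species, conversions in profiles.items():
                product = conversions.get(chemical)
                if product and product not in seen:
                    seen.add(product)
                    frontier.append((product, chain + [species]))
        return None

    print(find_chain("pollutant_X", "benign_product", catalytic_profiles))
    # -> ['species_A', 'species_B', 'species_C']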

Saturday, March 13, 2010

Reducing stochasticity in biology

Suppose we are diagnosing a set of symptoms of a disease. If there is only one symptom, we will give our conclusion very little weight because that one symptom could be due to random chance. However, if we see multiple symptoms, then it is probably not due to random chance but due to some cause.

This simple rule is sufficient to eliminate stochastic effects in the final decision: make decisions based on multiple observations rather than a single observation.

In a cell, a "decision" can be something like upregulating a gene. If this decision is made by a single transcription factor that detects some aspect of the environment, then the decision (transcription) will be noisy. In contrast, if multiple transcription factors that all respond to the same environment are used, then transcription becomes a function of the sum of multiple independent stochastic processes, and the relative noise of that sum is always lower than that of any single input. Unfortunately, multiple regulators also mean multiple association/dissociation events, which would add noise of their own. The solution would have to be a bit more clever than this, but the general idea holds.
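A quick numerical sketch of the general idea, assuming a toy two-state promoter model with made-up parameters: compare the relative noise of a decision read from one regulator versus ten independent regulators that all respond to the same environment.

    import numpy as np

    rng = np.random.default_rng(0)

    def regulator_readout(n_regulators, n_samples=100_000, p_bound=0.3):
        # Each regulator is independently bound (1) or unbound (0); the "decision"
        # signal is the average occupancy across all regulators.
        bound = rng.random((n_samples, n_regulators)) < p_bound
        return bound.mean(axis=1)

    for n in (1, 10):
        signal = regulator_readout(n)
        cv = signal.std() / signal.mean()  # coefficient of variation = relative noise
        print(f"{n} regulator(s): relative noise = {cv:.2f}")
    # The relative noise drops roughly as 1/sqrt(n), so the downstream decision is steadier.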