Random Processes in Cells (April 12-14 1996)

A meeting in Cambridge made possible by the support of the Gatsby Charitable Foundation.

Location: Churchill College. Participants were accommodated in the college and the meetings took place in the study centre (between Churchill College and the Moller Centre, see map).


Organizers: Dennis Bray, Simon Laughlin and David MacKay,
and other members of the Biological Computation group.


Contact: Sue Scott. Phone +44-1223-336614. Email Sue.Scott@bbsrc.ac.uk



(*) = short talk.

Registration: 12.00

Lunch: 1.00-2.00

Friday after lunch (2.00-4.15)

Chair: Dennis Bray - Theme: Introduction / single molecules in cells
Horace Barlow History, Noise and Prediction
James Michaelson The cellular dice game
R.M. Simmons Molecular motors

Tea: 4.15-4.45

Friday after tea (4.45-6.30)

Chair: Graeme Mitchison - Theme: Neural firing and VLSI firing
Michael N. Shadlen Noise, Neural Codes and Cortical Architecture
Alister Hamilton Analogue Silicon VLSI: noise, pulses and opportunities

Dinner: 7.00

Saturday morning (9am-10.45am)

Chair: Dennis Bray - Theme: Chromosomes
Tomas Lindahl Error frequencies during DNA replication and DNA repair
Robert Palazzo Chromosomal segregation in eukaryotes
(*) Purnananda Guptasarma Replication-induced (RI) transcription and the synthesis of low copy number proteins in Escherichia coli

Coffee 10.45-11.15

Saturday before lunch (11.15-1.00)

Chair: Simon Laughlin - Theme: Evolution and Genes; and other topics
Laurence D. Hurst Stochastic processes in evolutionary genetics
Other participants: about three short presentations, including:
(*) Jonathan Hodgkin Incomplete penetrance and mutant sex determination in C. elegans
(*) Susan Tweedie Measuring transcriptional noise

Lunch: 1.00-2.00

Saturday lunch to tea

Free for discussions and relaxation.

Tea: 4.00-4.30

Saturday after tea (4.30-6.15)

Chair: Horace Barlow - Theme: Vision
Trevor Lamb The random nature of phototransduction
Ed Pugh Opsin noise and the evolution of rods and cones
(*) Hans Liljenstrom and Peter Arhem Spontaneous Signalling: Roles of Spike Interval and Amplitude Fluctuations

Dinner: 7.00

Sunday morning (9am-10.45am)

Chair: Trevor Lamb - Theme: Synapses
Tony Zador Unreliable synapses: Bug or feature?
Simon Laughlin Noise in Cell Signalling

Coffee 10.45-11.15

Sunday before lunch (11.15-1.00)

Chair: David MacKay - Theme: Assorted
Robert F. Brooks The Origin of Cell Cycle Variability
Bill Bialek Physical limits to sensation and perception, revisited

Lunch: 1.00-2.00

Sunday after lunch (2.00-4.00)

Chair: John Daugman - Theme: Assorted
Other participants: about three short presentations, including:
(*) Rudi De Koker Domain Structures, Diffusion, and Brownian Motion in Lipid Membranes
Howard Berg Random Events in Bacterial Chemotaxis

Tea: 4.00-4.30


Horace Barlow (Cambridge)

History, Noise and Prediction
The fact that noise very often limits performance has not been known for long, and the history of the recognition of its importance in biology will be briefly reviewed. There are always two aspects that have to be considered: the noisiness of the biological mechanisms themselves, and the variability in the environment that has to be ignored if biologically important regularities are to be exploited. It is the latter aspect that makes perception an inherently difficult task, and I shall describe a new view about the action of pyramidal neurons in the visual system (and probably elsewhere in the cerebral cortex) that arises from taking seriously the problem of abstracting useful signals from the noisy messages delivered to our senses from the environment.

James Michaelson

The cellular dice game
Because of our large size, chemistry has the appearance of a continuous process, but to cells, chemical change comes as a random hailstorm of molecules. My colleagues and I have called this cellular expression of the inevitable randomness of chemistry THE CELLULAR DICE GAME. For example, a simple calculation of the number of growth factor molecules bound per cell reveals that there are very few such ligands bound per cell; indeed, sometimes there are not even as many bound growth factor molecules as cells. Thus, under ordinary physiological conditions, there is only ONE Insulin-like growth factor II molecule bound for every TWO cells with receptors for this ligand. This inevitable randomness in the chemistry of single, discrete molecules is also evident within cells, in the expression of the genome, although for reasons that depend upon quantum uncertainty. In my own lab we have found one very dramatic example of such random gene expression: the expression of the genes coding for plasma proteins in the cells of the liver. These random features of cellular behavior may seem to contradict the highly ordered nature of multicellular life. However, multicellular organisms contain such enormous numbers of cells (10^23 in a human) that the ordered features of metazoan development may emerge as a natural consequence of the law of large numbers. Indeed, my colleagues and I have found that the familiar features of cellular growth in SHAPE, SIZE, and TIME may be a result of the random allocation of growth factor molecules among cells, as revealed by mathematical analysis and computer simulation.
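The random-allocation argument can be illustrated with a toy simulation (a sketch with invented numbers, not Michaelson's actual model): scatter a limited pool of ligand molecules uniformly at random among cells and look at the cell-to-cell spread.

```python
import random

# Illustrative toy model: N_MOL ligand molecules are allocated uniformly at
# random among N_CELLS cells, mimicking the "one IGF-II molecule bound for
# every two cells" regime described in the abstract.
random.seed(0)
N_CELLS = 10_000
N_MOL = 5_000          # mean of 0.5 bound molecules per cell

counts = [0] * N_CELLS
for _ in range(N_MOL):
    counts[random.randrange(N_CELLS)] += 1

mean_bound = sum(counts) / N_CELLS
frac_zero = sum(1 for c in counts if c == 0) / N_CELLS
print(f"mean bound per cell: {mean_bound:.2f}")
print(f"fraction of cells with none bound: {frac_zero:.2f}")  # ~exp(-0.5) ≈ 0.61
```

In this regime the per-cell counts are approximately Poisson with mean 0.5, so over half of the cells receive no ligand at all: a cell-level "dice game" even though the bulk concentration is perfectly well defined.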


R.M. Simmons

Molecular motors
In the last few years, advances in techniques for in vitro motility assays and nano-mechanics (optical tweezers, microneedles) have made it possible to record the unitary force and displacement produced in single interactions between a motor protein and its target protein filament (kinesin with the microtubule, myosin with actin). Native two-headed kinesin moves down a microtubule protofilament in 8 nm steps (equal to the tubulin dimer separation), probably in a hand-over-hand fashion, for considerable distances. Muscle myosin interacts transiently with an actin filament, with the movement per interaction being variously estimated at 4-17 nm. It is not clear whether the myosin interaction distance (or step size) is related to a conformational change in the myosin molecule or to the distance between one or more actin monomers (5.5 nm per monomer). In both systems thermal noise imposes limits on what can be measured, but is noise a nuisance or a necessity? Thermal ratchet mechanisms propose that the production of force and movement results from thermal noise, biased in one direction by ATP hydrolysis. Can such theories account for the mechanics and energetics of motor proteins? Or is a conformational change needed (and/or present)?

Michael N. Shadlen

Noise, Neural Codes and Cortical Architecture
Neurons in the visual cortex discharge action potentials with marked variability in their rate and timing. Even under experimental conditions, when the same visual pattern is shown over and over again, the rate of neural discharge varies considerably from trial to trial. Moreover, during a period in which the neuron is responding at a nominally constant rate, the time interval from one action potential to the next appears nearly random. These observations suggest that the spike discharge of cortical neurons might be characterized as a stochastic point process. The irregular timing of spikes and the variability of spike counts pose constraints on the coding of visual information in cortex. To encode visual properties dynamically and accurately, the cortex must transmit multiple copies of messages. This design requires shared connections which, in turn, impose fundamental limits on fidelity. First, I will illustrate these principles using data from direction-selective neurons recorded from rhesus monkeys during performance of a visual discrimination task. Second, I will propose a theory to account for the stochastic nature of the neural response. Specifically, the variable discharge arises as a consequence of cortical microcircuitry that permits the neuron to integrate many thousands of synaptic events into a graded response. The problem is that cortical neurons receive a plethora of excitatory synaptic input, only a tiny fraction of which is sufficient to elicit an action potential. To avoid response saturation, the neuron balances excitation with inhibition, suggesting an analogy between the integration of synaptic activity toward a spike and a particle undergoing a rectified random walk to absorption.

This computational strategy might be relevant to other biological systems in which many input events are counted, more or less, and mapped to a narrow range of discrete outputs.
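The random-walk analogy can be sketched numerically (an illustrative toy with made-up parameters, not Shadlen's model): balanced excitatory and inhibitory inputs drive a "membrane potential" that is rectified at rest and absorbed at threshold, and the resulting interspike intervals are highly irregular.

```python
import random

random.seed(1)
THRESHOLD = 15   # net excitatory steps needed to fire (invented number)

def interspike_interval():
    # Balanced excitation/inhibition: up or down steps with equal probability,
    # rectified at resting potential (0), absorbed at THRESHOLD.
    v, steps = 0, 0
    while v < THRESHOLD:
        v += 1 if random.random() < 0.5 else -1
        v = max(v, 0)
        steps += 1
    return steps

isis = [interspike_interval() for _ in range(2000)]
mean_isi = sum(isis) / len(isis)
var_isi = sum((x - mean_isi) ** 2 for x in isis) / len(isis)
cv = var_isi ** 0.5 / mean_isi
print(f"mean interval: {mean_isi:.0f} steps, CV = {cv:.2f}")
# CV comes out near 0.8 here; a perfect integrator counting 15 independent
# Poisson inputs would instead give CV ≈ 1/sqrt(15) ≈ 0.26
```

The balanced walk preserves the irregularity of the input in the output, whereas pure integration of many excitatory events would regularize it, which is the crux of the argument above.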

Alister Hamilton.

Analogue Silicon VLSI: noise, pulses and opportunities
This talk will introduce the topic of noise in silicon VLSI - how it arises and what can be done to remove it. A pulsed signalling mechanism originally inspired by biological neurons will be briefly discussed, along with the opportunities that such a scheme provides for signal processing in VLSI.

Tomas Lindahl

Error frequencies during DNA replication and DNA repair
A mammalian cell is estimated to accumulate about 10 stable mutations per year, although much higher frequencies have been detected in some malignant cells. In order to sustain this very low error frequency, a number of control steps are required during DNA replication, such as editing of newly incorporated nucleotides and separate post-replicative mismatch correction events. These mechanisms are quite effective, so most human spontaneous mutations appear to result from endogenous damage to DNA by oxidation, hydrolysis, and accidental base destruction by reactive metabolites rather than from DNA replication errors. Thus mutations occur in a time-dependent fashion, largely independent of the number of cellular replication events. Proof-reading of errors introduced during DNA repair is less efficient than that during replication, and mechanisms for DNA strand discrimination during repair are primitive compared to those available after DNA replication. Some DNA lesions are ignored by the repair system because they are not strongly mutagenic or cytotoxic; accumulation of such damage could conceivably contribute to slow deterioration and ageing of cells in long-lived organisms.

Robert Palazzo

Chromosomal segregation in eukaryotes
Cell reproduction requires the faithful replication of a mother cell's genetic material and the accurate and complete delivery of equal shares to two new daughter cells. In eukaryotes, replicated DNA is packaged into sister chromosomes which are segregated to daughter cells by an elaborate microtubule-containing apparatus known as the mitotic or meiotic spindle. The interaction of two key structural components, microtubules and kinetochores, is required for spindle assembly and chromosome segregation. Microtubules are anisotropic protein polymers which undergo stochastic oscillations between periods of polymerization (growth) and depolymerization (shrinkage). Kinetochores are discrete chromosome structures which contain microtubule-dependent motor proteins and associate with microtubules to provide the force required for directional chromosome movement during segregation. During spindle assembly, microtubules emanate from two microtubule-organizing centers, the spindle poles, and probe the cell cytoplasm by random polymerization/depolymerization reactions. Microtubules that contact kinetochores are stabilized, resulting in "chromosome capture". Since the kinetochores of sister chromosomes are oriented at 180 degrees, binding to microtubules from one spindle pole orients the sister kinetochore to face microtubules emanating from the opposite pole. Once tethered to both poles through kinetochore-microtubule interactions, chromosomes align at the spindle mid-zone, segregation is initiated, and sister chromosomes move toward opposite spindle poles. To ensure the fidelity of chromosome segregation, cells monitor the proper alignment of chromosomes on spindles. Natural or experimental malorientation of chromosomes restricts chromosome segregation until all chromosomes are properly aligned. Experimental destruction of unattached kinetochores, or application of tension to chromosomes that are attached to only one pole, can overcome this restriction, allowing premature chromosome segregation. Thus, kinetochore-microtubule binding and/or the physical tension which results from kinetochore-microtubule interactions may provide signals required for cells to segregate chromosomes and complete the cell division cycle.

Laurence D. Hurst

Stochastic processes in evolutionary genetics
Stochasticity affects evolutionary processes in at least two different modes. First, it is often assumed that mutation (and hence the creation of variation) is random in both direction and rate. The extent to which this is true is an open issue. That the per-genome mutation rate appears to be constant from phage to fungi suggests that mutation is not simply something that happens. The extreme example of repeat-induced point mutation suggests that some mutation is directional (at least in a loose sense). These two examples indicate that to some degree organisms are adapted to control potentially stochastic effects. The extent to which mutational stochasticity determines the course of evolutionary history is also presently under reconsideration. The finding of repeated parallel events suggests that mutational accident must be less important than previously assumed. Second, stochastic effects can determine the fate of new alleles (i.e. what happens to the variation). Non-stochastic models of evolution lead one to suppose that advantageous alleles will always increase in frequency when rare and deleterious ones will always decrease. Models of the variance in gene frequency change due to stochastic effects (random mortality and Mendelian sampling) indicate that most advantageous alleles do not go to fixation and that some deleterious alleles will increase in frequency (possibly going to fixation). This body of theory is important for at least two reasons. First, the fact that the rate of substitution of neutral alleles is thought to be independent of population size is critical to the existence of a molecular clock. Second, the realisation that mutational decay is a major problem leads to the conjecture that selection's main action may be to preserve the status quo (i.e. remove deleterious mutations) rather than to promote adaptive change (as classically assumed). Sex and recombination may be critical to the purging of deleterious mutations.
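The fate-of-new-alleles point can be made concrete with a small Wright-Fisher sketch (illustrative parameters, not from the talk): even a clearly advantageous allele, starting as a single copy, is usually lost to sampling. Branching-process theory puts its fixation probability near 2s.

```python
import random

# Toy haploid Wright-Fisher model: one copy of an allele with selection
# coefficient s in a population of N. Each generation, N offspring are drawn
# with a selection-weighted probability. Theory predicts P(fixation) ≈ 2s for
# small s, so most advantageous alleles are lost.
random.seed(2)
N, s = 200, 0.05

def fixes():
    i = 1                                            # one initial copy
    while 0 < i < N:
        p = i * (1 + s) / (i * (1 + s) + (N - i))    # post-selection frequency
        i = sum(1 for _ in range(N) if random.random() < p)
    return i == N

trials = 1000
p_fix = sum(fixes() for _ in range(trials)) / trials
print(f"P(fixation) = {p_fix:.3f}  (2s = {2 * s:.3f}; neutral expectation 1/N = {1 / N:.3f})")
```

Roughly nine runs in ten lose the allele despite its 5% advantage, illustrating why sampling variance, and not selection coefficients alone, determines the fate of variation.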

Trevor Lamb

The random nature of phototransduction
The signals in rod photoreceptors depend on the absorption of individual photons, and on the subsequent interaction of individual molecules of activated rhodopsin (R*) with other proteins - both for activation and for inactivation of the response. The protein molecules of the G-protein cascade of phototransduction undergo 2-D lateral diffusion at the surface of the disc membrane, and the interactions of the proteins occur stochastically as a result of collisions between molecules. A single R* catalyzes the activation of molecules of G-protein to G*, and these activated G*s in turn bind to, and activate, molecules of a third protein, the effector E, to form E*. Although it is possible to obtain a relatively simple analytical description of the first stage of reaction (G* activation), by analogy with the diffusion of heat in two dimensions, it has not yet been possible to provide an equivalent description of the second step of reaction (E* activation). However, stochastic simulations have proved a valuable tool in investigating both steps (Lamb, T.D., 1994, Biophysical Journal 67, 1439-1454). A notable finding has been that efficient coupling from G* activation to E* activation requires only a very low concentration of the effector protein E in the membrane. Interestingly, it seems that the concentration of effector in the membrane of real rods may be optimized: there appears to be just sufficient E present to permit reasonable coupling of G* to E*. An excess of E above this level would contribute spontaneous thermal activation without significantly aiding transduction. Recently, the stochastic nature of the inactivation reactions has been investigated, and results will be discussed.

Ed Pugh

Opsin noise and the evolution of rods and cones
Rods and cones, and their retinal pathways, have evolved to function in different intensity regimes: rods function well in light-impoverished conditions, harvesting almost every photon incident on the retina, while cones function well in bright illumination, and indeed never saturate in steady light. The thesis to be presented is that the evolutionary pressures which led to the evolution of the two main classes of vertebrate photoreceptors acted initially and continually on the photopigments themselves, attempting to satisfy, as it were, competing constraints of the two different illumination environments. For example, a "rhodopsin" must be an ultraquiet enzyme in its "unactivated" form, if any one of the typically 10^8 or more molecules is to produce a reliable signal upon photon capture; this constraint leads to the hypothesis that the chromophore-opsin association should be very tight, minimizing structural fluctuations that lead to activity. In contrast, cone pigments should be capable of fast regeneration, which may necessitate a less tight fit between opsin and chromophore; a "less tight fit" may underlie the chromophore exchange that occurs in the dark in cones. Dark chromophore exchange in turn may lead to a fluctuating population of naked opsins, whose enzymatic activity may cause ineluctable cone dark noise. A brief review of published results consistent with the thesis will be presented, and functional consequences discussed.

Tony Zador

Unreliable synapses: Bug or feature?
Synaptic transmission in the cortex is surprisingly unreliable: at some synapses, the failure rate can be 90% or higher, so that fewer than one out of every ten presynaptic impulses leads to a postsynaptic response. Furthermore, even the size of the response when it occurs is highly variable. This stochasticity is an intrinsic and fundamental property of synaptic transmission, difficult to reconcile with most theories of cortical function which assume that brain components are reliable. In my talk I will discuss the sources of synaptic unreliability and some of its implications for cortical function.

Simon Laughlin

Department of Zoology, University of Cambridge, England
Noise in Cell Signalling
Cell signalling involves molecular collisions and activation. Do these chance events introduce significant noise, and have cell signalling systems evolved to overcome these limitations? These questions can be approached in two ways. The first is to model signalling processes and deduce, from first principles, the physical constraints upon reliability. The second is to measure the degree to which intact signalling systems contaminate biologically relevant signals with noise. The compound eye of the blowfly exemplifies this pragmatic second approach. Two cell signalling processes are analysed, phototransduction via a 2nd messenger (PI) cascade and chemical synaptic transmission. Several sources of noise are described, their effects on signalling evaluated, and means of reducing these fundamental noise constraints demonstrated.

The absorption of photons by rhodopsin molecules follows Poisson statistics, so the rate of activation of membrane bound receptor molecules limits fidelity. Phototransduction operates with a high quantum efficiency to make full use of activated receptors. At high light levels the signal to noise ratio is limited by the available transduction machinery. The photon transduction rate is regulated to make optimum use of the finite number of transduction units in a cell, and larger cells achieve higher SNRs. The cascade generates significant levels of transducer noise - identical inputs (rhodopsin activations) produce voltage pulses of different size. This limitation can, in principle, be reduced by compartmentalisation. Synaptic transmission is noisy, presumably from the random nature of vesicle release, and this limitation is reduced by matching gain to signal statistics. The high frequency components of the shot noise generated in signalling are eliminated by matched filtering. It is concluded that shot noise limits cell signalling, even in systems that are highly evolved to maximise the rates at which they transmit information. Paradoxically, these rates for molecular systems are far below the limits imposed by single protein molecules, suggesting that shot noise limitations stem from the need for signal divergence and amplification in fluid media.
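The Poisson limit referred to above has a simple quantitative form: if photon absorptions are Poisson, the signal-to-noise ratio of an intensity estimate grows only as the square root of the mean count. A minimal sketch (illustrative numbers, not from the talk):

```python
import math, random

random.seed(3)

def poisson(lam):
    # Knuth's multiplication method; adequate for moderate lam
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

snr = {}
for mean_count in (10, 100):
    samples = [poisson(mean_count) for _ in range(5000)]
    m = sum(samples) / len(samples)
    sd = (sum((x - m) ** 2 for x in samples) / len(samples)) ** 0.5
    snr[mean_count] = m / sd          # for Poisson, SNR = sqrt(mean)
    print(f"mean count {mean_count:4d}: SNR = {snr[mean_count]:5.2f}"
          f" (sqrt(mean) = {mean_count ** 0.5:.2f})")
```

Quadrupling the photon catch only doubles the SNR, which is one way to see why larger cells, with more transduction units and higher transduction rates, achieve higher SNRs.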

Robert F. Brooks and Marek Hola

The Origin of Cell Cycle Variability
Cell cycle times are highly variable in general, but are correlated in sister cells. For mammalian cells growing optimally, the variability is consistent with a model in which cell cycle initiation is a random process (responsible for the experimentally observed exponential distributions of differences between sister-cell cycle times), together with a second source of variability that is "shared" identically by sister cells (responsible for the sibling correlation). In order to gain further insight, we have been making use of the ability of Xenopus egg extracts to induce the initiation of DNA synthesis (one of the earliest steps in the cycle after commitment) in isolated nuclei. Surprisingly, we have found that intact nuclei, prepared by scrape-rupture of quiescent cells, initiate replication asynchronously in egg extracts (as in vivo) despite exposure to a common level of initiation factors. However, the two nuclei of a permeabilized binucleate cell invariably initiate replication simultaneously, with different binucleates initiating at different times. It therefore appears that the time taken to respond to the inducers of DNA synthesis depends on some variable property of the nucleus that is shared by the sister nuclei of a binucleate cell. We suggest that this property is responsible for the sibling correlation in cycling cells. The nature of this property is as yet unknown. We are currently looking at the relevance of observed differences between nuclei in the rates of nuclear transport and in the levels of the Cdk inhibitor, p27.
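The two-component model described above can be sketched as follows (hypothetical numbers, purely illustrative): each sister's cycle time is a component shared identically within the pair plus an independent, memoryless "initiation" delay. The shared part produces the sibling correlation; the memoryless part produces the exponential distribution of sister-cell differences.

```python
import random

random.seed(5)
LAM = 1 / 2.0                     # rate of the random initiation step (mean 2 h, invented)

pairs = []
for _ in range(5000):
    s = random.gauss(10.0, 1.5)   # shared component, identical in both sisters (invented spread)
    t1 = s + random.expovariate(LAM)
    t2 = s + random.expovariate(LAM)
    pairs.append((t1, t2))

# sibling correlation comes from the shared component
m1 = sum(t1 for t1, _ in pairs) / len(pairs)
m2 = sum(t2 for _, t2 in pairs) / len(pairs)
cov = sum((t1 - m1) * (t2 - m2) for t1, t2 in pairs) / len(pairs)
v1 = sum((t1 - m1) ** 2 for t1, _ in pairs) / len(pairs)
v2 = sum((t2 - m2) ** 2 for _, t2 in pairs) / len(pairs)
corr = cov / (v1 * v2) ** 0.5

# sister differences are exponential: |Exp - Exp| is exponential with the same rate
diffs = [abs(t1 - t2) for t1, t2 in pairs]
mean_diff = sum(diffs) / len(diffs)
print(f"sibling correlation: {corr:.2f}")
print(f"mean |sister difference|: {mean_diff:.2f} h (exponential, mean 1/lambda = 2 h)")
```

Both signatures reported in the abstract fall out of this decomposition, which is what makes the binucleate experiment informative: it isolates the shared, nucleus-borne component.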

W. Bialek

Physical limits to sensation and perception, revisited
The laws of physics impose fundamental limits on the performance of any device that is designed to make measurements on the world. Classic examples from vision include the noise due to random arrival of photons and the blur due to diffraction. Decades of work have shown that biological systems operate near these limits, and these results have influenced how we think about the mechanisms of sensory transduction. One might think that this line of reasoning has more or less dried up and, for example, that we understand (at least in outline) how it is possible for biological systems to count single photons. I will argue that, despite these advances, there are several open problems:
  • We understand the origin of gain in photon counting (why ~10^8 ions cross the cell membrane when one rhodopsin molecule is isomerized by light), and we know that the principles of molecular amplification in phototransduction are applied universally throughout biology. We do not understand the regulation of this gain or the reproducibility of the single photon response, and this brings us to questions of noise in the internal dynamics of the reaction network responsible for molecular amplification. These issues may be quite general. I will outline some theoretical constraints on mechanisms of gain regulation, focusing on the problem of reproducibility.
  • Signals from receptor cells are eventually discretized, as action potentials and/or as synaptic vesicles. This discretization imposes limits on how much information can be transmitted from one cell to the next, since this information must be less than the entropy of the discrete signal. Measurements on information transmission in spiking neurons and at the first synapse in fly vision indicate that the nervous system comes close to these information theoretic limits as well. These results have clear implications for the classic debate about rate vs. timing codes in the nervous system, as well as raising questions about our understanding of synaptic transmission.
  • Central processing of sensory information can be done with a precision that reaches the limits set by noise in the input data. Examples come from human psychophysics and from the problem of motion estimation in fly vision. Reaching this limiting or optimal performance is not just a matter of reducing noise, but of carrying out the correct computation. In the case of fly vision, this leads to a theory of motion estimation that predicts several new features of the input/output relations, even for this very well-studied system. The crucial idea is that optimal processing involves a compromise between the raw sense data and some prior assumptions about what to expect; as we (or the fly) move through the world, these assumptions need to be revised, leading to a picture of adaptive computation. Recent experiments support this prediction, and there are large open questions about the structure of adaptation in more naturalistic stimulus ensembles.
All of these ideas have been developed in a world view where we know in advance what is signal and what is noise. In the laboratory, we can turn off the signal and calibrate our description of the noise statistics, but this is unrealistic for biology. The fact that the nervous system can reach the limits imposed by physical noise sources means, qualitatively, that the brain must learn, in an essentially unsupervised fashion, the distinction between signal and noise. I will present some preliminary ideas about how this might be done.

Howard Berg

Random Events in Bacterial Chemotaxis
The behavior of E. coli is stochastic: cells choose new directions at random, at random times. A fundamental limit on their ability to sense and respond to spatial gradients of chemical attractants is the time over which they are able to swim in a straight line. This limit is set by rotational Brownian motion. The longer cells are able to count molecules in their immediate environment, the more precisely they can determine concentrations and changes in concentrations. The motors that drive their flagella appear to be stepping motors that advance at random times. This can be shown by analysis of variance of rotation period. So, not only do cells of E. coli know a lot about stochastic behavior, stochastic analysis is an important part of their study.
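The variance analysis mentioned above can be sketched as follows (invented numbers, not Berg's data): if one revolution is completed in N rate-limiting steps with exponential waiting times, the rotation period is gamma-distributed with coefficient of variation 1/sqrt(N), so the measured CV of the period reveals the step number.

```python
import random

random.seed(4)
N_STEPS = 50                     # hypothetical steps per revolution
RATE = N_STEPS * 100.0           # step rate chosen so the mean period is 10 ms

# Simulate revolution periods as sums of N_STEPS exponential waiting times
periods = []
for _ in range(4000):
    t = sum(random.expovariate(RATE) for _ in range(N_STEPS))
    periods.append(t)

m = sum(periods) / len(periods)
var = sum((t - m) ** 2 for t in periods) / len(periods)
cv = var ** 0.5 / m
print(f"mean period = {m * 1000:.1f} ms, CV = {cv:.3f};"
      f" implied step number ≈ {1 / cv ** 2:.0f} (true N_STEPS = {N_STEPS})")
```

Inverting the relation, 1/CV^2 estimates the number of stochastic steps per revolution, which is the logic behind inferring stepping from the variance of the rotation period.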

Discussant contributions (abstracts)

Jonathan Hodgkin

Incomplete penetrance and mutant sex determination in C. elegans
A common and important phenomenon in developmental genetics is incomplete penetrance: organisms of identical genetic composition may vary stochastically as to whether they exhibit a mutant phenotype or not. In the system I study, sex determination in the nematode Caenorhabditis elegans, it is possible to create mutant genotypes which may lead to perfect male, perfect female or intersexual development, apparently at random. In contrast, in the wild type, the reliability of sex determination is 100%. Some clues to the basis of the mutant variability are available. A second example from C. elegans is the signal transduction process involved in vulval development. Some lineage mutants exhibit classic incomplete penetrance, either no vulval induction or perfect induction. The molecular basis of this variability is becoming apparent.

Rudi De Koker

Domain Structures, Diffusion, and Brownian Motion in Lipid Membranes
Lipid monolayers at the air-water interface provide a convenient model system for theoretical and experimental investigations of the behavior of cell membranes. Binary mixtures of lipids can be prepared in a regime where two liquid phases coexist, and form a variety of domain patterns, ranging from striped phases to ordered or disordered arrays of circular domains. Theoretical work on these systems has focused on equilibrium patterns and domain shapes, Brownian motion of molecules and mesoscopic domains, and thermal fluctuation of the domain boundary. Proteins added to a phase-separated monolayer act as impurities, and aggregate preferentially in one phase or at the phase boundary. The domain structure of the monolayer thus has a significant effect on protein diffusion through the lipid matrix. This effect has been postulated to play a role in real cells, where it could help to regulate a variety of reactions occurring in or at the cell membrane. A brief review of theoretical ideas will be followed by a discussion of their biological relevance.

Hans Liljenstrom and Peter Arhem

On Spontaneous Signalling in Small Central Neurons: Mechanisms and Roles of Spike Interval and Amplitude Fluctuations
Spontaneous activity seems essential for the information processing in the brain, for the development of synaptic contacts, for neuronal plasticity, memory, etc.
We are studying the spontaneous activity at several levels, ranging from experimental studies at the microlevels of isolated neurons and neurons embedded in brain tissue slices to simulations at the macrolevel of cortical network models. We have focused on small hippocampal neurons (<10 um diameter).
The reason for studying hippocampal neurons is that this cortical structure, in spite of its relatively simple architecture, seems essential for a number of cognitive functions such as memory formation. The reason for focusing on small neurons is that, in spite of their abundance in cortical structures, they are little investigated, for experimental reasons. Classical microelectrode measurements require relatively large cells, like pyramidal cells >100 um in diameter. However, with the introduction of the patch-clamp technique at the beginning of the eighties it became possible to study small cells with good precision. We have used this technique to analyze small hippocampal neurons, isolated as well as embedded in relatively intact slices. For the latter recordings we used the technique to select cells under visual control.
The analysis of spontaneous impulse trains in these neurons, both in isolation and in slices, revealed two unexpected features: (i) impulses varying in amplitude, and (ii) single-channel events correlated with impulse currents.
The amplitude fluctuations imply a deviation from the all-or-nothing principle and suggest that amplitude modulation may complement frequency modulation as a mode of information processing in the brain.
Impulses correlated with single-channel events suggest single-channel-induced impulses, making the stochastic nature of channel kinetics essential for understanding the global information processing of the brain. This also illuminates micro-macro transitions.
We will discuss possible mechanisms, and possible functional roles, of this type of fluctuation. It will be shown by computational methods that the amplitude modulation is input-related and extrinsic. The role of single-channel-induced impulses will be analyzed in simulations of cortical neural network models. Preliminary simulations show that small groups of spontaneously active neurons may induce synchronous oscillations, which may be involved in learning and memory functions.

Purnananda Guptasarma

Replication-induced (RI) transcription and the synthesis of low copy number proteins in Escherichia coli
Over eighty percent of the genes in the E. coli chromosome appear to express fewer than a hundred copies each of their protein products per cell. It is argued that transcription of these genes is neither constitutive nor regulated by protein factors, but rather induced by the act of replication. The utility of such replication-induced (RI) transcription for the temporal regulation of synthesis of determinate quantities of low copy number (LCN) proteins is strongly emphasized. It is suggested that RI transcription may be both necessitated and, perhaps, facilitated by the folding of the bacterial chromosome into a compact nucleoid. Mechanistic aspects of the induction of transcription by replication are discussed, especially in terms of the modulation of transcriptional initiation by negative supercoiling effects, promoter methylation status and derepression. RI transcription is shown to offer plausible explanations for the constancy of the C period of the E. coli cell cycle, and the remarkable conservation of gene order in the chromosomes of enteric bacteria. Some straightforward experimental tests of the hypothesis are proposed.

Susan Tweedie

Institute of Cell and Molecular Biology, The University of Edinburgh, Darwin Building, King's Buildings, Edinburgh EH9 3JR, Scotland
Measuring transcriptional noise
We are interested in trying to measure levels of transcriptional noise. We have used a quantitative PCR method to measure the number of processed transcripts for several genes in cell types where they should not be expressed. Initial studies in mammalian cells suggest that there are a total of around 6 such "illegitimate transcripts" in any given cell.
Our interest in this type of information stems from the proposal that gene number has not increased continuously during evolution, but has risen in discrete steps. Although there is a shortage of data, it appears that one of the biggest steps occurred at the invertebrate-to-vertebrate transition. We suggest that gene number, and hence biological complexity, may be limited in invertebrates by the necessity to regulate gene expression. Failure to repress genes efficiently in cells where they should be silent could result in an unacceptable level of transcriptional noise. As gene number increases, the number of illegitimate transcripts per cell will also rise, and consequently maintenance of cell-type integrity could be compromised. Thus there may be a maximum number of genes which can be tolerated using the existing regulatory devices.
In theory, this potential block on gene number could be overcome by introducing a novel general mechanism which represses the level of transcriptional noise without affecting appropriate expression. DNA methylation could provide such a repression mechanism, and we were interested to note that the vertebrate/invertebrate transition is also associated with a change in the pattern of CpG methylation. We intend to use our PCR assay to look for differences in "noise" levels between methylated and unmethylated genes.

Other information

This page maintained by:
David MacKay <mackay@mrao.cam.ac.uk>
Last modified: Mon Apr 15 09:56:21 1996