From OpenWetWare

Bottom-Up vs. Top-Down

Gabriel Wu 03:42, 13 February 2013 (EST): In principle, a minimal genome that has just enough genes for a cell to grow and divide sounds good. I wonder how reliable and stable this chassis might actually be, though. What if removing all the "redundant" pathways results in a fragile cell where the addition of new genes results in non-viability? For example, biofuel products are notoriously toxic to the cell. If a bottom-up approach is taken by starting with a minimal genome and then inserting ethanol genes, the minimal-genome cell will likely never grow up due to alcohol toxicity. If engineering is an iterative process, it may be difficult to optimize a strain that doesn't grow at all. Sometimes it's easier to start with a cell that can tolerate small amounts and then knock in or out whichever genes are needed to improve tolerance.

  • Neil R Gottel 15:55, 13 February 2013 (EST):An iterative approach: start with a low-expression version of your ethanol production construct that reduces growth, but doesn't kill the cell. Then, start adding in ethanol tolerance genes, looking for increases in the growth rate. Then bump up the expression of the ethanol gene again, and tweak the ethanol resistance again. Question is: will the end result of this be better than starting with a non-minimized, ethanol resistant cell? And how long would each approach take?
    • Gabriel Wu 01:45, 14 February 2013 (EST): I would be surprised if the bottom-up approach would be faster than top-down. Also, if we knew all the interactions among ethanol tolerance genes, we might not need the bottom-up approach to begin with. Also, our ability to "bump up" expression levels is less than well understood.
  • Thomas Wall 19:00, 14 February 2013 (EST): At this point in our biological knowledge, a bottom-up approach will take much longer. Figuring out what is causing your problem in a native genome is much easier than trying to figure out which part you took out needed to be there (if it's truly minimal). I think getting rid of non-coding DNA might not be as big of a problem.
  • Jeffrey E. Barrick 11:10, 17 February 2013 (EST):I agree that it's much harder to build something than to selectively lose parts to get adaptation. Should we be thinking about building a maximal genome? As long as it survived with a ton of new genes, then it would be far more likely to have the potential to evolve useful functions.
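Neil's alternating approach above can be sketched as a toy optimization loop. Everything here is hypothetical for illustration only: the `growth_rate` function, the 0-10 fitness scale, and the step sizes are made-up stand-ins, not measured biology.

```python
def growth_rate(expression, tolerance):
    """Hypothetical growth rate on a 0-10 scale: production expression
    is toxic, and tolerance genes offset that toxicity. Illustration only."""
    toxicity = max(0, expression - tolerance)
    return max(0, 10 - toxicity)

# Start with a low-expression production construct that slows growth
# but doesn't kill the cell.
expression, tolerance = 2, 0

for _ in range(5):
    # Step 1: add tolerance genes until growth recovers near wild type (>= 9/10).
    while growth_rate(expression, tolerance) < 9:
        tolerance += 1
    # Step 2: bump production expression back up, accepting a temporary
    # growth cost, then repeat the cycle.
    expression += 2

print(expression, tolerance, growth_rate(expression, tolerance))
```

The loop captures the open question in the thread: each cycle ratchets production up while tolerance lags one step behind, so the end state depends on how many iterations you can afford compared with starting from an already-tolerant, non-minimized strain.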

Aurko Dasgupta 21:15, 14 February 2013 (EST): Given that the fitness effects of a gene can only be stated in the context of a particular genome, isn't the genome reduction method extremely haphazard? Many genes likely fall into families where the removal of one produces no change in viability, but prevents the removal or alteration of other genes.

Environmental Dependence

Max E. Rubinson 10:34, 14 February 2013 (EST): Shouldn't a "clean" or "minimal" genome refer to the minimum set of genes that an organism needs to survive and reproduce in a defined environment?

  • Evan Weaver 17:52, 14 February 2013 (CST): Fixed it.
  • Thomas Wall 19:05, 14 February 2013 (EST): I believe that is what it is. But in the world of biocatalysts/metabolic engineering you are constantly changing the environment.
  • Marco Howard 19:05, 14 February 2013 (EST): When working with biocatalysts, do we know the rate constant? If so, we should be able to predict how the environment will change over time. Perhaps we could use Neil's iterative approach to design bacteria that can tolerate these conditions.

Yunle Huang 20:20, 14 February 2013 (EST): What are some advantages and disadvantages of each of the genome reduction methods?

  • Evan Weaver 18:36, 14 February 2013 (CST): The papers I'd read didn't list any advantages compared to other methods; they just described their own methods and the cool reduced cell they created. Here's some stuff I thought of: comparative genomics would find things that gene disruption wouldn't, because gene disruption may disrupt a conserved essential gene. All three of the first techniques are a roadmap for genome reduction: without them, reduction would take longer and be much more expensive.

Core and Accessory Genes, Pan-genomes

Thomas Wall 19:16, 14 February 2013 (EST): This is a cool paper I found talking about genome reduction that people might like.

  • Evan Weaver 17:52, 14 February 2013 (CST): Yeah, this is the paper I mentioned in class. Thanks so much for linking it. It looks like it has a cool graphic as well that I may add.
  • Jeffrey E. Barrick 11:54, 17 February 2013 (EST): That paper brings up accessory genes as being even more easily lost (because they are genetically unstable). The opposite of an accessory gene is a core gene. The field of defining these two categories has advanced a lot since the Blattner comparative genomics work; for example, one study compared 61 E. coli genomes.

Multicopy Genes

Kevin Baldridge 14:05, 14 February 2013 (EST): One of the things mentioned in the introduction is duplicate genes being superfluous. I wonder what the cutoff is for the minimal copy number of a gene, for example ribosomal RNA genes. If you only had one copy, I would find it surprising that the cell would be viable/healthy, given the large amount of ribosomal RNA produced during exponential-phase growth.

Blattner Group

Neil R Gottel 17:25, 12 February 2013 (EST): I recommend that the work of the Blattner Lab on clean E. coli genomes should feature prominently in this article, since that's the state-of-the-art right now, and is a commercial product sold by Scarab Genomics. Is there a version of that sweet graphic showing the deleted regions on the genome that we're allowed to use? Because that'd be a great addition.

  • Jeffrey E. Barrick 12:51, 14 February 2013 (EST):Since it's publicly posted on a poster on the Scarab Genomics website, I think we can also post that figure.
  • Evan Weaver 17:52, 14 February 2013 (CST): This is pretty cool. I didn't know he was selling these. While I do agree that his work was very important, I'd be sceptical about calling it state-of-the-art since his reduction was done over 10 years ago. Wouldn't a newer source of reduced strains be more reduced, genetically stable, or contain less genetic scarring?
    • Neil R Gottel 17:41, 16 February 2013 (EST):10 years ago? The Science article describing MDS42 is from 2006 (although the project was started in 2002, if that's what you mean). They've also continued to improve on MDS42, such as this 2012 paper in which three DNA polymerases that are induced by the SOS response were removed. The new strain has a mutation rate 50% lower than MDS42. With regards to scarring, they used lambda red recombination for in-frame deletions, which just leaves behind a FRT site.
      • Evan Weaver 11:13, 18 February 2013 (EST): I was unaware that Blattner had more recent papers on reduced cells. I was referring to the paper that was published in 2002. I must not have seen his name.
  • Jeffrey E. Barrick 11:01, 17 February 2013 (EST):I think that it is a valid observation that the state of the art has not progressed much since the Blattner genome and Venter's work. Why is that? Have all the cool things been done? I'd say that MDS42 actually is a useful resource, particularly with the refinements that Neil mentions, but I don't really see any papers switching to using this strain. It's a lot easier to keep working with what you know.


Neil R Gottel 17:25, 12 February 2013 (EST):At some points this article has turned into a wall of text, particularly for the section on estimating the number of essential genes. I think it'd be helpful if a few people took a subheading and cleaned it up (and documented what they did here). For my part, I rewrote the section on "Genome Synthesis".

  • Kevin Baldridge 14:00, 14 February 2013 (EST): I made a couple of minor edits in the transposable mutagenesis and mRNA disruption sections for readability and capitalization. Furthermore, I changed the last sentence in the comparative genomics section from underestimation to overestimation -- it seems that if you assume a gene is essential from conservation but it's not, that would be a false positive and thus overestimate the resulting number of essential genes.
    • Evan Weaver 17:52, 14 February 2013 (CST): Thanks for catching that. It does seem like a Great Wall of text. Would breaking each section into smaller pieces make it more readable? For instance, "what it is" and how it works in one paragraph, then the caveats in a different paragraph.