What’s the catch?

If someone were to try to create a Neanderthal a few years from now, starting with ancient DNA, they’d have to worry a lot about data errors, because such errors would translate into mutations, which might be harmful or even lethal. Assume that we have figured out how to get the gene expression right, have all the proper methylation, etc.: we have modern humans as a template, and, you know, there isn’t that much difference.

They might try consensus averaging – take three high-quality Neanderthal genomes and make your synthetic genome by majority rule: we ignore a nucleotide change in one genome if it’s not there in the other two. ‘Tell me three times’ – a simple form of error-correcting code.
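In code, the majority rule might look like this – a minimal sketch, treating each genome as an aligned haploid string (a simplification; real genomes are diploid and would need alignment first):

```python
from collections import Counter

def majority_consensus(genomes):
    """'Tell me three times': per-site majority vote over aligned sequences.
    A nucleotide change seen in only one genome is outvoted by the other two.
    (On a three-way tie, the first genome's base is kept -- arbitrary.)"""
    assert len({len(g) for g in genomes}) == 1, "sequences must be aligned"
    return "".join(Counter(site).most_common(1)[0][0] for site in zip(*genomes))

# A data error (or rare variant) present in one genome is removed:
print(majority_consensus(["ACGT", "ACGA", "ACGT"]))  # -> ACGT
```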

But doing this would cause a problem. Can you see what the problem is?


64 Responses to What’s the catch?

  1. Hm says:

    Well, normal variation, like SNPs? If you took three humans at random, the percentage on which they all agreed might be way smaller than you expect, and cut out most of what makes modern humans different from other humans. Also, are you forcing the nucleotide to be the same, or the coded protein? (I’m not an expert, but I get the sense that the second condition is way more involved to compute.)

    It looks to me like you’d be better off with 2 out of 3, but I don’t have a good error model. Would we expect our errors to be uncorrelated? Did the Neanderthal samples live in the same region?

  2. Guess says:

    Alleles have frequencies other than 0 and 1. For an allele with frequency f, it will only show up in our neo-Neanderthal with probability f^3. This will create huge systematic differences in genotype, and phenotype. These might be lethal, and certainly would produce something very different from the original Neanderthals.
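    Guess’s f³ figure assumes the variant must be present in all three genomes; under the post’s two-of-three rule the retention probability is f³ + 3f²(1−f). Either way, low-frequency variants are nearly wiped out – a quick check (hypothetical helper names):

```python
def retention_unanimous(f):
    """Variant survives only if present in all three genomes."""
    return f ** 3

def retention_majority(f):
    """Variant survives if present in at least two of three genomes."""
    return f ** 3 + 3 * f ** 2 * (1 - f)

for f in (0.01, 0.1, 0.5):
    print(f, retention_unanimous(f), retention_majority(f))
# A 1% variant survives majority rule with probability ~0.0003;
# a 50% variant survives with probability exactly 0.5.
```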

    • gcochran9 says:

      I doubt if there would be systematic changes in phenotype – but there could be problems in alleles subject to frequency-dependent selection, like HLA.

      Improve the algorithm. Take 50 high-quality Neanderthal genomes, and use a probabilistic algorithm that chooses the 10% allele 10% of the time.

      The real problem still exists.
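      A minimal sketch of that improved algorithm – per-site sampling in proportion to sample frequency (hypothetical function name; genomes treated as aligned haploid strings, and note that drawing each site independently ignores correlations between sites):

```python
import random
from collections import Counter

def frequency_weighted_consensus(genomes, rng=None):
    """At each site, draw an allele with probability equal to its frequency
    in the sample: a 10% allele is chosen about 10% of the time."""
    rng = rng or random.Random(0)  # fixed seed for reproducibility
    out = []
    for site in zip(*genomes):  # one column per aligned position
        counts = Counter(site)
        alleles = list(counts)
        out.append(rng.choices(alleles, weights=[counts[a] for a in alleles])[0])
    return "".join(out)

# Sites where the sample is unanimous are reproduced exactly:
print(frequency_weighted_consensus(["ACGT"] * 50))  # -> ACGT
```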

      • Guess says:

        Dubious hypothesis:

        Doing nucleotide-by-nucleotide comparisons neglects higher-level structure. Say that you have a gene that produces protein X. And it turned out that some modification to gene X was helpful, say reducing its activity. Then 2 mutant versions of the gene arise, each reducing the activity of the gene in some different way. They are both selected for initially, but act as substitutes for each other, and wind up 50:50 in the population. Everyone carries copies of the modified versions, but not necessarily the same one.

        When we use our probabilistic nucleotide-by-nucleotide comparison, we get a 25% probability of having neither version, 50% of having one, and 25% of having both nucleotide changes. The “both” and “neither” cases could be bad.
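        That 25/50/25 split can be checked with a toy simulation – a hypothetical setup with two perfectly anti-correlated variants, each at 50% frequency:

```python
import random

rng = random.Random(42)

# Every individual carries exactly one of the two substitute variants,
# so "mut1" and "mut2" are each at 50% frequency but never co-occur.
population = [("mut1", "wt2"), ("wt1", "mut2")]

trials = 100_000
both = neither = 0
for _ in range(trials):
    # Independent per-site draws, as in naive nucleotide-by-nucleotide averaging:
    a = rng.choice(population)[0]
    b = rng.choice(population)[1]
    both += (a == "mut1" and b == "mut2")
    neither += (a == "wt1" and b == "wt2")

# Roughly a quarter of synthetic genomes get both variants and a quarter
# get neither -- combinations no real individual carried.
print(both / trials, neither / trials)
```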

      • Anonymous says:

        yes, that’s my view too… the best example I can think of is blood types, alleles that have existed for a long time in the primate lineage, iirc… an average between a blood-A and blood-B genome is probably not viable… for Neanderthals, or for us sapiens 😉

      • aisaac says:

        Does it have something to do with deletions, inversions, or insertions? Reading one genome out of frame with the other?

    • misdreavus says:

      @Guess, Anonymous

      Wouldn’t you expect recombination to make deleterious effects of this nature somewhat rare? (Well, at least not for frequency-dependent alleles that are that common within a random sample of Neanderthals.) We are, after all, a diploid species. The gene products almost always end up folding properly, no matter how much you mix up the primary sequences. We’d all be long gone if this weren’t the case.

      We know that considerable diversity in the ABO blood group system among human beings is the product of recombination. You would think if the glycosyltransferases were responsible for a function that was that important, as well as _that_ sensitive to nucleotide shuffling of this nature, everybody should have two copies of same allele. But we plainly do not.

      • Anonymous says:

        yes, this is a good point, but if a simple nucleotide average were viable, how could discrete genotype variants be maintained for a long time, as with blood types? the only way, imho, is that averages are not viable (or strongly fitness-reducing), or that the discrete types are phenotype-only and at the gene level there is a continuum of variation… if the kind of correlation I was speaking about exists, there must be something preserving it despite recombination… and that would be reduced fitness for simple nucleotide averaging…

  3. Rum says:

    Regression towards the mean.
    I mean, Neanderthals were like wolves to our coyote ancestors. Within the same species, but not the same.

  4. misdreavus says:

    Neanderthals, just like human beings, have a diploid genome.

  5. ron says:

    Problem is that merging multiple strains is a form of evolution itself, so we don’t get a Neanderthal but something else?

  6. Anonymous says:

    You’d make a Neanderthal which was much smarter, taller, and healthier than both the average Neanderthal of his day, and the average human.

    Forgetting about the Neanderthal: if you tried to apply that technique (consensus averaging) to a human, you’d arrive at a human with a dearth of the fail alleles which cause human variation on a number of QTs, particularly in the nervous system (affecting intelligence). Virtually all of the non-consensus variants are function-reducing, and virtually all of these imperfections would be removed by the above consensus averaging. In layman’s terms, the human would probably have an IQ far higher than any human that has ever lived. Far higher –> very far higher. Using the term correctly, IQ would likely break down as a metric at this level, because there would be such a huge gap between the synthetic organism and its closest runner-up.

    Doing consensus averaging on a Neanderthal would have the same effect, plus make the super-organism – the likes of which has never existed in the wild – a Neanderthal. It would probably be a fantastically interesting organism. A “problem” about the situation, if you could call it that, is that retarded non-genomicists who don’t understand that most human variation is simply “bad” and function-reducing mutation would conclude that this has to do with how superior the Neanderthal species was compared to us, and how evil the white man was for wiping out the natives when we moved into Europe.

    Frankly though, the “problems” that would emerge from making a consensus organism would pale in comparison to how fucking awesome it would be to study.

    • Hm, didn’t log in. Above post is me.

      • Anonymous says:

        hum, the averaging is done at the nucleotide level? then there is a problem when one nucleotide is poorly correlated within the population, but highly correlated with other (neighbouring) nucleotides in the same individual… this is very common, no? a gene with multiple well-defined variants fits this definition, and that is an allele, I think… after the averaging you end up with new genes, not any of the pre-existing alleles… the new gene may be completely non-functional…
        any other cross-correlation between nucleotides in one individual will cause this problem, not only within genes… but the cross-correlations that are not respected by sexual reproduction are probably not life-threatening…

        or my post may be completely rubbish, I am still half asleep 😉

      • Anonymous says:

        self-responding: my post is probably not rubbish. let’s check a practical example: the gene coding for blood type. the variants have existed for a long time, I think, since before the Neanderthal speciation, iirc… so I doubt that averaging A and B would produce something viable, in either Neanderthal or sapiens…

      • Yeah, what you’re describing is a real thing. It’s mostly called “outbreeding depression”. Complexes of interdependent variants can be disrupted. It happens in nature all the time, and it’s ultimately the cause of speciation.

        That’s balanced by something called “outbreeding vigor”, AKA hybrid vigor, AKA reduction of inbreeding depression, where the opposite happens. It’s why humans are sexually attracted to the smell of mates who are not too genetically similar: we can smell markers of a mate’s immune system, and benefit from genetic diversity in this trait.

        All in all, compared to consensus averaging to the reference sequence, the effect would likely be vanishingly small.

    • gcochran9 says:

      He might not like us.

    • nameless37 says:

      Curious theory … but I recall dealing with a similar story a few months ago, so I happen to have a counter-argument.

      You cannot assume that any non-consensus variant with any significant presence in the population (say, >1%) is function-reducing. If it were function-reducing, it would be selected out. A de novo mutation whose primary effect is a genuine loss of function (for any value of “function”) would never reach 1% of the population. Most rare (but not exceedingly rare) alleles were beneficial and underwent positive selection at some point in our genetic history, or were major alleles at some point and were substantially but incompletely selected out.

      For a new function or a non-selected trait, any major allele is equally likely to be positive or negative. Take height as an example. It’s nicely heritable and it has not been strongly selected for. Here’s a study identifying the 20 SNPs with the strongest correlations with height:
      http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2681221/ Out of the 20, the major allele corresponds to greater height in 12 cases, and to lower height in 8. The difference is not statistically significant.

      Now let’s talk about IQ. If most rare IQ-affecting alleles are “bad”, it would mean that we’ve been undergoing selection for IQ for a long time. And if that were the case, we’d see poor heritability of IQ (since alleles with strongest effects would be mostly fixed by now) and inverse Flynn effect (mean intelligence declining from generation to generation due to de novo mutations). In reality, we see the opposite, high heritability and no sign of inverse Flynn. Which is a good indication that we have NOT been selected for IQ for all that long (my personal guess would be that there aren’t any populations in the world, with possible exceptions of Koreans and the Japanese, which were undergoing significant selection for IQ more than 1000 years continuously.) And so there should be plenty of IQ-reducing major alleles in the genome.

      • You cannot assume that any non-consensus variant with any significant presence in the population (say, >1%) is function-reducing.
        I didn’t say “any”. A majority effect is enough for the argument.

        If it were function-reducing, it would be selected out. A de novo mutation whose primary effect is a genuine loss of function (for any value of “function”) would never reach 1% of the population.
        Wrong, and stupid.

      • Anonymous says:

        You cannot assume that any non-consensus variant with any significant presence in the population (say, >1%) is function-reducing.
        The procedure doesn’t just remove high frequency variants. It removes low frequency ones, likely to be detrimental, with even greater certainty.

        If most rare IQ-affecting alleles are “bad”, it would mean that we’ve been undergoing selection for IQ for a long time. And if that were the case, we’d see poor heritability of IQ (since alleles with strongest effects would be mostly fixed by now) and inverse Flynn effect (mean intelligence declining from generation to generation due to de novo mutations).
        We’d only see poor heritability if mutational load was not significant, and evolution for IQ had been slow enough that all selective sweeps are complete. Inverse Flynn would occur if selection for IQ was now relaxed or gone (very likely) but would not be that strong in the case of small-effect genes since a mutation with 1% fitness cost (for example) should have a rate 100 times the amount that’s added de novo in each generation.

      • nameless37 says:

        You didn’t say “any”, you said “virtually all”.

        “We’d only see poor heritability if mutational load was not significant, and evolution for IQ had been slow enough that all selective sweeps are complete. ”

        All we need to have poor heritability is for environmental effects to swamp genetics.

        “Inverse Flynn would occur if selection for IQ was now relaxed or gone (very likely) but would not be that strong”

        No, it would not be strong. But we seem to be observing the wrong sign altogether.

        The whole argument would be quite absurd if you step back and look at it objectively. IQ is a measure of one’s skill at mental manipulation of symbols, with extensions to literacy and numeracy. Humans weren’t selected for IQ for long, but most of them were selected in relatively recent times. Is it likely that mechanistic “cleaning” of the Neanderthal genome would make that Neanderthal, whose ancestors never even dreamt of letters or numbers, able to manipulate these symbols in his neocortex with performance superior to modern humans? If so, why not assume that he’ll also be able to lift heavy pianos and run 1:30 marathons? It’s fairly self-evident (to me, anyway) that, even if one could engineer an individual capable of running a 1:30 marathon, it would definitely not suffice to clean up the genome; one would need specific modifications to muscles and the cardiovascular system, with assorted side effects.

        And why stop at Neanderthals: why not apply the same process to chimps?

      • nameless37 says:

        Also, as an aside, one aspect that’s notoriously underrated with respect to the genetics of IQ is copy number variation. Many (most?) known genetic adaptations that separate humans from other primates (for example, the human ability to digest grains and tubers) are not SNPs, they are CNVs. There was a study in 2011 that estimated that 45% of IQ variance within the Anglo/white population was accounted for by rare copy number deletions. (By “rare”, they meant CNVs occurring in fewer than 9 of their 196 subjects; the vast majority were seen in 1 or 2 subjects.) No one has ever gotten anywhere near 45% even in the most extensive genome-wide SNP studies. And yet somehow most scientists seem to be stuck in the SNP mindset – possibly just because testing for SNPs is nowadays easy and ubiquitous, while testing for CNVs is more complicated and expensive.

        This does not change the argument much; de novo CNVs are more frequent than de novo SNP mutations, but much of the logic still applies.

        • gcochran9 says:

          You are mistaken: de novo SNPs are much more common than de novo CNVs. A baby has ~60 new SNPs, of which maybe 10% are deleterious. The mutation rate for new CNVs is about 1.2 x 10^-2.

      • Anonymous says:

        The whole argument would be quite absurd if you step back and look at it objectively. IQ is a measure of one’s skill at mental manipulation of symbols with extensions to literacy and numeracy. Humans weren’t selected for IQ for long, but most of them were selected in relatively recent times. Is it likely that mechanistic “cleaning” of Neanderthal genome would make that Neanderthal, whose ancestors never even dreamt of letters or numbers, able to manipulate these symbols in his neocortex with performance that is superior to modern humans?
        The fact that Neanderthals had large expensive brains proves that being smart had adaptive value to them, symbol manipulation or not.

    • gcochran9 says:

      You would probably do a more sophisticated version of this, not necessarily always at the nucleotide level : maybe haplotype consensus, exceptions for inversions and copy number variation, etc. But the point is that a theoretically unsophisticated geneticist (the most common kind) could create a (possibly inimical) superbeing by accident, if he used any kind of data fusion that removed rare variants. Creating superbeings by accident is a stock notion in comic books and science fiction, involving everything from radioactive spiders to toxic waste, but this is the first plausible scenario I’ve heard of.

      • DB says:

        There’s plenty of variation in e.g. dog intelligence, so it should be possible to get a feel for the potential impact of spellchecking human genomes by observing the impact of spellchecking animal genomes.

      • Robert E. Howard really ought to have been a reader of this blog. I hadn’t thought about that, but it would make a really solid scifi.

      • Lemniscate says:

        Wouldn’t it be better to create a super-modern? I guess it doesn’t have as much of the cool sci-fi aspect, but it would be more dangerous than a super-Neanderthal, especially if it started cloning itself.

      • Guess says:

        Doesn’t the Africa-Europe difference in load show that the effects wouldn’t be overwhelming? Big differences (20%-30%) in mutational load, but quintupling that phenotype difference (assuming all-genetic causes) wouldn’t be overwhelming, or outside the human range.

        A big group would be scarier but one individual seems safe. And as others say, this would be tested on mice and farm animals before Neanderthals, indicating the impact of the procedure.

    • Greying Wanderer says:

      “to how fucking awesome it would be to study”

      heh

  7. Miley Cyrax says:

    Because playing “majority rules” with genomes would be a de facto acknowledgement that less mutational load is better – however, mutational load differs between sapiens population groups – so we can’t have that now, can we?

  8. Lemniscate says:

    Apart from the ‘problem’ of removing deleterious rare variation, it would also create an ‘ancestral’ Neanderthal: any alleles derived since the last common ancestor of the three Neanderthals would not be called. If the Neanderthals sequenced were closely related this might not be much of an issue, although then they would share a lot of rare variation and the Neanderthal created would not be quite as super. If we took three remotely related Neanderthals (different Neanderthal races) then we would create a super-ancestral-Neanderthal. I bet it would win world’s strongest man.

    • Keep in mind, the human reference sequence is the default.

      The more distant the three “Neanderthal” sequences are from one another, the more your synth would just be a perfect reference Homo Sapiens Sapiens with a few Neanderthal quirks.

      • Lemniscate says:

        But the process should still remove all the alleles derived on the modern line since the Neanderthal-modern common ancestor. It wouldn’t be like a modern human; it would push closer and closer to the Neanderthal-modern split, with all the deleterious rare variation removed.

  9. Imagine a case of one locus.

    There’s
    1) Human refseq allele
    2) Neanderthal a allele
    3) Neanderthal b allele
    4) Neanderthal c allele

    If 2), 3), and 4) disagree, you revert to 1). So, the more disagreement between Neanderthals, the more dominant the human refseq is. Conversely, the more agreement between a, b, and c, the more Neanderthal you get in the Sapiens/Neanderthalensis hybrid.
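    As a sketch, the refseq-fallback rule might look like this (hypothetical function: majority vote across the Neanderthal genomes, with the human reference as the default):

```python
from collections import Counter

def consensus_with_reference(human_ref, neanderthals):
    """Majority rule across the Neanderthal genomes; where all three
    disagree, fall back to the human reference allele."""
    out = []
    for ref_base, site in zip(human_ref, zip(*neanderthals)):
        allele, count = Counter(site).most_common(1)[0]
        out.append(allele if count >= 2 else ref_base)  # no majority -> refseq
    return "".join(out)

# The three Neanderthals disagree at the second site, so the human
# reference base is used there:
print(consensus_with_reference("AAAA", ["ACGT", "AGGT", "ATGT"]))  # -> AAGT
```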

    • Lemniscate says:

      That’s only a problem if you have a locus where the modern sequence carries an allele derived since the Neanderthal-modern split and there has been a separate mutation on the Neanderthal line since the last common ancestor of the three Neanderthals. That is going to be quite a rare occurrence.

      If there has been a mutation on the modern line and not the neanderthal, however, then 2,3,4 are in consensus and the ancestral variant is called. This is a much more likely occurrence and will remove most of the variation unique to modern human evolution.

      • The latter is by far the more frequent occurrence, sure. But what’s your point there? Heightened genetic distance will still make this (rare) occurrence more frequent than lowered genetic distance would.

        The common ancestor dominance you describe, and human refseq dominance I describe, would both increasingly factor for increasing genetic distance.

        Also, keep in mind that loci with more than two common alleles are not that rare in a single, nonsplit population. And older, pre-bottleneck Neanderthalensis had a lot more SNP variation than modern, post-bottleneck Sapiens do, and increasingly so the further back you go.

      • Lemniscate says:

        I agree that taking more distantly related ‘Neanderthals’ would lead to more of both occurrences, but I was just disagreeing with your original statement that this would lead to a ‘perfect reference Homo Sapiens Sapiens’, as most of the modern derived alleles would be changed to either ancestral or Neanderthal alleles — when there has been a mutation common to all three Neanderthals at the same locus as a modern derived allele. It would certainly become less (late) Neanderthal but it would do this by mainly becoming more ancestral than modern.

      • Lemniscate says:

        I guess this brings up the question of how ‘ancestral’ (and how low in genetic load) we could go in birthing a hominid…I think we’d need a super-adventurous woman for that.

  10. observer says:

    Fascinating scenario. As Redzengenoist said above, you’d effectively create a Super-Thal (possibly super tall as well) by spell-checking the genome.

    However, one wonders if there would be serious problems caused by excessive height and excessive brain size, leading to, e.g., myopia and hideous migraines if the cranium doesn’t increase in the right proportion to the brain; or trouble standing up if the height doubles but the heart size only increases by X%, etc. Current anatomical proportions are calibrated based on the fact that deleterious mutations are inevitable, and if they disappeared you can’t assume that the structural proportions of the whole would remain functional, because not all proportions would be altered by spell-checking to the same degree, nor would they be automatically scaled by size in the way that they need to be. Consequently it seems probable to this uneducated person that the Super-Thal would die in adolescence because of flawed anatomy.

    But how about this more likely scenario: accidentally spell-checking the Mammoth genome. Now _that_ would be interesting. And if we keep quiet it could even happen by accident not too long from now.

    African elephants already have the most neurons of any non-human animal. According to Cochran’s theory there may also be a general intelligence benefit the further animals live from the equator. So it wouldn’t be implausible for mammoths to be smarter than African elephants in the first place. But – if you spell-checked the mammoth genome, is there not a chance you would create a human-level intelligence organized in a completely non-human (non-primate) fashion?

    • Matt says:

      Current anatomical proportions are calibrated based on the fact that deleterious mutations are inevitable, and if they disappeared you can’t assume that the structural proportions of the whole would remain functional, because not all proportions would be altered by spell-checking to the same degree, nor would they be automatically scaled by size in the way that they need to be.

      Complex organisms have a lot of feedback mechanisms to make sure the anatomy coevolves – e.g. the braincase pretty much forms its shape and size based on brain growth, with some relatively small influence from the face and in the opposite direction (rather than the shape and size of the braincase and brain being determined independently in development, with natural selection more or less sorting it out by calibrating the two, which wouldn’t really work very well).

      These kind of mechanisms might improve under an environment of less load (the mechanisms seem to me likely to present fairly large targets), although perhaps not sufficiently, so the point may still stand.

  11. Tamerlane says:

    This is a bit off topic but one issue that I have not seen discussed in any of these proposals to “recreate” extinct forms of life is an issue that I’ll characterize as uterine epigenetics. I’m curious whether my thoughts on this matter are totally off-base or not.

    Based on the headlines, the zygote containing Neanderthal genetic material will be placed in a modern human’s womb, where it will be maintained for some period of time by human circulation. But we don’t really know the characteristics of the Neanderthal womb, nor the chemical/nutritional character of Neanderthal blood. We don’t have a clue about Neanderthal gestation periods. All of these would have a profound effect on fetal development and the final gestational product. (Another, related problem is that the ovum containing this material will have human mitochondria and, perhaps, a chemical composition different from that of a Neanderthal ovum.)

    Similar issues exist for, e.g., bringing a “mammoth” zygote to term in an elephant’s womb. The problems become even more severe for oviparous species: inserting T Rex DNA into a hen’s egg is extremely unlikely to generate any useful outcome.

    • Nyk says:

      We do know for a fact that sapiens x neanderthalensis hybrids have existed in the past and were healthy enough to pass on their genes within the out-of-Africa population.

  12. Matt says:

    Sounds particularly disturbing if one were to “Bring me the clone of Genghis Khan!” (assuming sufficient fossilized remains with corrupted aDNA were found), or of any similarly psychopathic (generally, any ruling-class monarch?) historical figure, rather than a Neanderthal.

    (Probably insufficient and weak) Hedges against Neanderkhannooniensingh might include
    – Clone a female
    – Filter for violence associated genes of large effect (MAO-A).
    – Use as much of the genome of a Neanderthal that shows signs of being cared for as possible (on the basis that such an individual probably wouldn’t have been the type to engage in anti-social behavior).

    An open question would be whether reducing mutational load itself actually promotes a violent or psychopathic phenotype in the wider Homo genus (are less mutated people generally more psycho / dominant ?). I have no way of knowing whether this is the case or not.

  13. bruce says:

    Why can’t I try consensus averaging on my kids? I’d like superkids. I’d settle for a consensus-averaged No Lower Back Trouble and Don’t Die of Cancer like my daddy, sister, both granddads etc. If GK Chesterton is right and the eugenic evolved Superman kills us all for the insolence of fooling with his genes, hey, he’s my boy- no big.

    Admirable attitude. To be fair, be certain of what you mean – 100% consensus averaging means they’d be “your” kids to about the same extent as they’d be mine or Greg’s, genetically. Consensus averaging = removing traces of resemblance to the flawed gunk you carry around with you in your overheated sack. A bit further down the scale, 20% or 50% or 80% averaging, if you e.g. average only the variants which are lowest-frequency, might mean a synth which is both highly awesome and has similarity to you.

      Even more detailed: a few of your non-refseq variants are likely to be “rare but good”, which you’d want to keep. In the (far) future, bioinformatics will be smart enough that you can pick and choose.

      Some love the idea of liberating a child from their own quirks and flaws. Some feel that this makes it less “their” child. I can’t see how either flavor of monkey logic is more or less objectively sane. I myself lean towards “whatever makes a good child”.

  14. The fourth doorman of the apocalypse says:

    Any comments on this paper?

    Variable NK cell receptors and their MHC class I ligands in immunity, reproduction and human evolution

    Abstract | Natural killer (NK) cells have roles in immunity and reproduction that are controlled by variable receptors that recognize MHC class I molecules. The variable NK cell receptors found in humans are specific to simian primates, in which they have progressively co-evolved with MHC class I molecules. The emergence of the MHC-C gene in hominids drove the evolution of a system of NK cell receptors for MHC-C molecules that is most elaborate in chimpanzees. By contrast, the human system of MHC-C receptors seems to have been subject to different selection pressures that have acted in competition on the immunological and reproductive functions of MHC class I molecules. We suggest that this compromise facilitated the development of the bigger brains that enabled archaic and modern humans to migrate out of Africa and populate other continents.

  15. AllenM says:

    Well, one further question would be: what kind of mitochondrial set would you use? After all, if you just build a proto-Neanderthal 46-chromosome set, what about all of the rest of the supporting structure? Further, what about immunity from the mother – or would you be able to use an artificial womb? If you used a human surrogate, what kind of carrying incompatibility could result? Placental-barrier questions, immunity questions, too many questions.

    This might be the question for the 22nd century.

  16. dave chamberlin says:

    The first catch I can think of is that there aren’t three high-quality Neanderthal genomes, and there never will be, so you wouldn’t catch all the data errors. With humans the risk of spontaneous abortion is 33%; with this attempt the risk would be very, very close to 100%. Genetics isn’t anywhere near being able to cut and paste over all the data errors with modern DNA. Pretty cool idea, trying consensus averaging on moderns, though. Kind of reminds me of when they consensus-average a face and the result is always very good-looking, thanks to excellent symmetry.

  17. jb says:

    OK, folks, we have the answer — the problem is that the resurrected Neanderthals would sue us into oblivion!

    http://takimag.com/article/thawing_out_the_neanderthals_gregory_cochran

    ROTFL!!! 🙂

    • erica says:

      jb, thanks for the link.

      Bravo, good stuff, on many levels.
      I hope the next book is not far around the corner.

    • Greg: Gonzo Geneticist

      • dave chamberlin says:

        I like that. If we could bring back Neanderthals, we would hand them some worthless land and the profits from some casinos. Northern Canada fits the bill. Go out in a snowmobile bus during the day to watch them chuck spears at the caribou and, of course, the mammoths (no Neanderthal theme park would be complete without those hairy beasts), and gamble by night.

  18. aericksoncornish says:

    One way to genetically engineer, say, super-smart babies would be first to find a large enough number of reliable ‘hits’ on IQ to account for a decent fraction of additive heritability, then create large numbers of fertilized embryos in vitro, sequence them, and implant the one with the highest number of favorable alleles – which in practice probably means the one with the smallest number of rare deleterious alleles. No need to know the actual biological mechanisms by which certain versions of genes are more favorable to IQ than others: this is engineering, and correlations will work just fine.

    Greg seems to be suggesting that another way to make super babies – not just in terms of a single trait, but in general – would be synthesizing DNA with sequences chosen by consensus averaging or another mechanism to weed out rare deleterious alleles, then inserting this DNA into an embryo and bringing it to birth. Presumably, with many more genomes to work with than three (and complete ones at that), you could do quite a lot to minimize genetic load, with all the concomitant benefits that come with it.

    I think “spell-checking the genome” has been proposed here as a plausible future method for genetically engineering people to favor various traits and increase fitness, and I think he is implying this consensus averaging of Neanderthal genomes would have a similar effect, although in a slightly different, extinct species. I did not read many of the comments, though, so I probably missed something or am just repeating what has been said.
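    The embryo-selection step described above can be sketched as follows – hypothetical variant names and effect scores; a real pipeline would derive weights from sequencing and association data:

```python
def pick_embryo(embryos, deleterious_score):
    """Return the embryo whose genotype carries the lowest total
    (score-weighted) burden of rare deleterious alleles."""
    def load(genotype):
        return sum(deleterious_score.get(v, 0.0) for v in genotype)
    return min(embryos, key=load)

# Hypothetical per-variant deleteriousness weights:
scores = {"rare_del_1": 1.0, "rare_del_2": 0.7, "common_benign": 0.0}
embryos = [
    {"rare_del_1", "common_benign"},
    {"common_benign"},
    {"rare_del_1", "rare_del_2"},
]
print(pick_embryo(embryos, scores))  # -> {'common_benign'}
```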

  19. Greying Wanderer says:

    “The fact that Neanderthals had large expensive brains proves that being smart had adaptive value to them, symbol manipulation or not.”

    Something I wondered was whether larger skulls were a quick fix for higher IQ – i.e., say a small-skulled population moving out of the tropics needed a higher average IQ to survive; that could be easily accomplished – assuming the population’s skull size was on a bell curve – by the outliers on the large-skull end surviving and the others not.

    This could then be repeated as the population spread north, leading to the observed pattern of larger skull sizes at higher latitudes.

    However a larger skull and brain size has a cost (and presumably an upper limit) so wouldn’t evolving a more efficient brain ultimately win out?

    I think that would explain the extreme northern latitude hunter-gatherers, large-skulled, higher IQ than more southern hunter-gatherers but in an environment where specialization – perhaps the impetus for more efficient brains? – isn’t possible so they only have one of the necessary components.

    In a nutshell, I don’t think an average cloned Neanderthal would be super-smart. I think their average would be similar to Eskimos (who would have been super-smart once).

  20. Sutro Helm says:

    Ask yourselves this:
    How many neanderthals would it take to change a light bulb? Extrapolate THAT!

  21. Pingback: Reinventing Humanity – A New World Order Wet Dream | Farm Wars

  22. Pingback: 501 (c)(3) : The Adventure Begins | West Hunter

  23. Sam Neumann says:

    Surely, any genetic sampling of a fossil includes material from many cells of that one individual, not just one cell. Assuming that the cells had the same genome and that any mistakes/degradation are random, then a sufficient sample of fossil material, followed by polling (‘tell me n times’), would eliminate the errors completely, without resorting to comparisons with other individuals.
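    That per-position polling is easy to sketch (assuming aligned, equal-length reads with independent random errors):

```python
import random
from collections import Counter

def poll_reads(reads):
    """'Tell me n times': per-position majority vote across aligned reads
    of the same individual."""
    return "".join(Counter(col).most_common(1)[0][0] for col in zip(*reads))

# Simulate error-ridden reads of one underlying sequence:
rng = random.Random(1)
truth = "ACGTACGTAC"

def noisy_copy(seq, error_rate=0.1):
    return "".join(rng.choice("ACGT") if rng.random() < error_rate else base
                   for base in seq)

reads = [noisy_copy(truth) for _ in range(25)]
# With 25 reads and 10% random errors, wrong bases are essentially
# always outvoted:
print(poll_reads(reads) == truth)
```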

  24. snek2020 says:

    The biggest problem is that even if the synthesis succeeds, you will have created yet another inferior race to demand gibs from whitey. There are already far too many of those. That’s why neanderthals should remain extinct. Most of the selection for higher intelligence occurred after Neanderthals went extinct, so it’s very unlikely that even with spellcheck the product would be particularly intelligent. The high brain volume probably just indicates a high spatial IQ for hunting, but they probably suck at everything else, like the Eskimos.
