
Plastination versus Cryonics

Break down survival as Drake equation, see how plastination differs from cryonics, try to calculate advantage

No man is an iland, intire of it selfe; every man is a peece of the Continent, a part of the maine; if a clod bee washed away by the Sea, Europe is the lesse, as well as if a Promontorie were, as well as if a Mannor of thy friends or of thine owne were; any mans death diminishes me, because I am involved in Mankinde; And therefore never send to know for whom the bell tolls; It tolls for thee….

John Donne, Meditation 17

The Drake equation for cryonics is just a number of sequential steps with independent probabilities, all of which must succeed but otherwise none more important than the others. (One software version is the Cryonics Calculator.) Web developers call it a conversion funnel. Specific equations and values have been proposed, usually yielding a probability of success 0 < x < 10%. For example, Steven Harris in 1989 estimated 0.2-15%; R. Mike Perry in the same article runs a different analysis to arrive at 13-77%; Ralph Merkle suggests >85% (conditional on things like good preservation, no dystopia, and nanotech); Robin Hanson calculated in 2009 a ~6% chance; Roko gave 23%; Mike Darwin in 2011 (personal communication) put the odds at <10%; an informal survey of >6 people (LW discussion) averaged a ~17% success rate; Jeff Kaufman in 2011 provides a calculator with suggested values yielding 0.2%; the 2012 LessWrong survey yields a mean estimate of cryonics working of 18% (n = 1100), and among ‘veterans’ the estimate is a lower 12% (n = 59) - but interestingly, they seem to be more likely to be signed up for cryonics.


One such Drake equation might break out the steps as follows:

  1. Likelihood of getting preserved

  2. * preservation contains needed information

  3. * information’s survival over the centuries until revival possible

  4. * existence of organizations or entities arranging revival

  5. * the actual revival

With those 5 values, one multiplies to get the final probability of each step coming true and hence of a successful revival. Because the steps are multiplied together with no weights, improvements are interchangeable - a relative improvement in one factor is as good as the same relative improvement in another factor: a 10% improvement in organizational continuity is as good as a 10% improvement in the odds that the vitrification preserves necessary information, which is as good as a 10% improvement in the odds that revival tech will be developed. The same holds for balancing gains against losses. A technology that increases the organizational parameter by 30% and decreases the information-preservation parameter by 10% would be a net gain, because the gain in one step outweighs the loss in another, regardless of what concrete values one assigns. (For example, if X was 50% and Y was 60% for a final chance of 30%, then you would be better off if you could do something different where X was 80% and Y was 40%, because that yields a final chance of 32%. This would be easier to see in a different notation like log odds, where multiplicative gains and losses simply add.)
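As a toy illustration of this multiplicative structure (the step values below are purely illustrative, not estimates), the funnel arithmetic can be sketched in a few lines:

```python
import math

def drake(probs):
    """Multiply independent step probabilities into an overall success probability."""
    result = 1.0
    for p in probs:
        result *= p
    return result

# Hypothetical values for steps 1-5 (preservation, information, survival,
# organization, revival) - illustrative only.
baseline = [0.5, 0.6, 0.7, 0.5, 0.4]
print(round(drake(baseline), 3))  # 0.042

# The X/Y example from the text: 50% * 60% loses to 80% * 40%.
print(drake([0.5, 0.6]) < drake([0.8, 0.4]))  # True: 0.30 < 0.32

# In log space, multiplicative trade-offs simply add, so a +30% relative
# gain on one step outweighs a -10% relative loss on another:
print(math.log(1.3) + math.log(0.9) > 0)  # True
```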


Biological samples have been accidentally preserved from the deep past through dehydration, freezing, anoxia, and chemical preservation; ancient DNA has (possibly) been recovered from 250 million year old salt crystals, 23 million year old insects are classifiable and preserved in high fidelity, and ice samples have preserved 800,000 year old and 400,000 year old DNA. Some recovery has been accomplished of 400,000 year old hominid DNA, 45,000 year old human and 38,000 year old Neanderthal DNA has been partially recovered & sequenced, as has 28,000 year old woolly mammoth DNA, 80,000 year old hominid DNA, and 700,000 year old horse DNA. 30,000 year old frozen plant tissue has been grown into healthy adult plants. One 4000 year old human genome was sequenced. A 2700 year old human brain has been recovered from a waterlogged English pit, heavily damaged but visibly still a brain; it is one of hundreds of brains recovered from watery environments. Many of the samples chemically preserved in amber turned out to be contaminated mistakes, but nevertheless, the preservation is very good and down to the cellular level:

Examination of the ultrastructure of preserved tissue in the abdomen of a fossil fly (Mycetophilidae Diptera) entombed in Baltic amber revealed recognizable cell organelles. Structures that corresponded to muscle fibers, nuclei, ribosomes, lipid droplets, endoplasmic reticulum, and mitochondria were identified with the transmission electron microscope. Preservation was attributed to inert dehydration as well as the presence of compounds in the original sap which functioned as natural fixatives. This evidence of cell organelles in fossilized soft tissues represent an extreme form of mummification since Baltic amber is considered to have formed about 40 million years ago.

(Even the color in dinosaur feathers has been preserved in amber.) Ben Best describes amber’s preservative mechanism in “Ancient DNA & Preservation in Amber”:

“Tree sap (resin) contains sugars as well as alcohols & aldehydes (including terpenes), which are dehydrating & antibiotic as well as providing an air-tight seal to prevent further entry of oxygen. Myrrh is a mixture of resin, gum and essential oils from the Commiphora plant that was used by the ancient Egyptians for embalming (by pouring it into the cranial, chest, abdominal and pelvic cavities) and mummification (by soaking the wrapping bandages in it)….Amber, as a sticky pitch from certain trees, can trap insects when fresh from a tree-wound. The sugars, alcohols & terpene-aldehydes diffuse into the insect to dehydrate & preserve. The amber surrounds the insect, providing an air-tight seal. Further oxidation & polymerization of the terpenes protect the insect from further damage. The continued polymerization of the amber terpenes eventually results in an insoluble gemstone-quality glass that preserves the insect in a strong encasement. Although such fortuitous combination of chemical preservation and oxygen-tight encasement should not be expected for preservation of large specimens (like humans or dinosaurs), the use of some hardened plastic or resin encasement could assist chemical and/or dehydration preservation.”

Plastination and chemical brain preservation have been seriously proposed1 as an alternative to cryonics, apparently first by Charles Olson in “A Possible Cure for Death”. R. Mike Perry discusses it favorably in “The Road Less Traveled: Alternatives to Cryonics”. Greg Jordan says no convincing counter-arguments have been raised since Olson and strongly approves of it in his post “Biostasis through chemopreservation”.

Is plastination a net gain?


Advantage for plastination:

  1. Improves survival parameter #3: It is probable that scanning technology will outstrip upload technology. In many fields, the ability to gather data exceeds the ability to process or understand it. Hence, it is quite likely that during the long wait for revival, it will become possible to scan a plastinated brain in sufficient resolution to eventually upload it.

    Even if the scan were destructive, such a scan would make it possible to drastically increase survival odds by copying the digital data to many archives and formats, online and offline. No such option is available to a cryonics brain unless it abandons cryonics entirely - in which case, why did it take the risk of the cryonics failing & it warming up, rather than being plastinated from the beginning? It’s hard to imagine the benefits being so equally balanced that the arrival of better scanning would be enough to change the plans - given how many parameters there are, a ‘pure’ strategy of 100% cryonics or 100% plastination will win. (Indeed, one might wonder how one would know that a plastination+scanning procedure was good enough for uploading in the absence of a successful human upload. Human biology often diverges from even close animal models, and shouldn’t we expect things like consciousness to be even less reliably modeled by those animal models? The window between the first successful upload and widespread uploading will be short compared to the time between now and then, even if you assume no Singularity of any kind, not even Robin Hanson’s Crack of a Future Dawn, and a slowed-down Moore’s law.)

  2. Improves organizational parameter #4: Plastination may be such a technology. It does not require organizational continuity: one rough year with cryonics and your brain is a pile of rotting maggots; one rough year with plastination, and your brain is a bit dusty2. A plastinated brain doesn’t even need an organization: it may be preserved as a time capsule, a family heirloom, a curiosity, or perhaps just buried somewhere; but a cryogenically stored brain must have a sophisticated support system which will supply it regularly with liquid nitrogen, and that rules out pretty much everyone but a cryonics organization. Mike Darwin has been a real wake-up call - the Outside View says ALCOR and CI are much more risky than usually assumed3 - and indeed, one cryonics organization has already failed with the loss of patients. Past the century mark, a few percentage points of survival is the optimistic estimate! Cryonics organizations have done reasonably well so far, but ALCOR consistently runs at a loss, and if membership does not follow exponential growth (as it does not), then relatively soon the ratio of dead members to live members will start getting worse.

  3. Improves likelihood of preservation #1: Much cheaper than vitrification; while cryogenic storage is very cheap at scale, the cost is still non-trivial for the foreseeable future.

  4. Improves revival parameter #5:

    1. despite being a relatively young field (albeit respectable & well-funded), plastination & scanning have made tremendous progress and are slowly being automated, with one human brain sliced at 70 micron thickness and photographed4, and partial connectomes5 of brains produced. One might characterize the two fields as: connectome:upload::revived-rabbit-kidney:functioning-brain, and ponder the following possibilities:

      • (=) If one regards the ‘distance’ between the state-of-the-art and the goal as equal, then plastination’s faster progress is a win

      • (<) If one regards the distance as smaller for plastination than cryonics, then plastination wins both on faster progress and how much is left to do

      • (>) Only if one regards the kidney as being much closer to reviving a functioning brain can it be possible for cryonics revival to beat plastination revival.

      Pondering the Roadmap and the Blue Brain project, I strongly doubt that the kidney-brain pair is closer together than the connectome-upload pair, and suspect that the latter is closer.6

    2. If plastination turns out to be the ‘right’ starting point for an upload and cryonics brains must be plastinated first, we might expect the cryonics->plastination process to be more lossy than the recently-deceased-brain->plastination process. It could be that warming the brain up enough to plastinate does damage, or that the cracks caused by vitrification are not reparable and degrade the plastinated result.
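The digital-redundancy point in advantage #1 can be made quantitative: if each of n independent copies of a scan survives with probability p, the probability that at least one survives is 1 - (1 - p)^n, which approaches certainty quickly. A minimal sketch, with a made-up value for p:

```python
def any_copy_survives(p, n):
    """Probability that at least one of n independent copies survives,
    each with survival probability p."""
    return 1 - (1 - p) ** n

# Even fairly unreliable archives become robust in aggregate:
for n in (1, 3, 10):
    print(n, round(any_copy_survives(0.5, n), 4))
# 1 copy: 0.5; 3 copies: 0.875; 10 copies: 0.999
```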



  1. Threatens information preservation parameter #2 in several ways:

    1. Can plastination preserve the level of detail required for reconstruction? Unknown.7 The Brain Preservation Technology Prize (to which I have donated) is attempting to spark research.

      Cryonics assumes, based on analogous near-death experiences, that many things, like dynamic electrical activity, can be disregarded for the purpose of personal identity. Plastination is known to preserve overall neural structure in high resolution, as evidenced by current plastination techniques sufficing to create connectomes, but what does it miss? It misses the dynamic activity, like cryonics, but cryonics preserves things plastination may not. Does plastination preserve neurotransmitter levels? (It seems inconsistent with the general idea of plastination.) Neurotransmitter levels change endlessly, but levels of neurotransmitters can be the difference between sanity and insanity in the living; on the other hand, personal identities persist even through careers of massive head trauma like boxing or football, which affect neurotransmitters (see Fencing response). What might we be missing?

    2. Are the methods well-studied and well-implemented, even if they are capable in principle of preserving the necessary information? They have been widely used in neuroscience, but there are no checks or ‘round trips’ showing that information and functionality are preserved by normally executed techniques - at least cryonics has frozen rabbit kidneys to test itself on; what does plastination have?

      Counter-point: brain scanning and the associated plastination techniques are an extremely hot field of research, which is improving at an amazing clip akin to DNA sequencing. This ought to give us considerable confidence in its current and future techniques. (This also raises an interesting point that anyone not dying in the next decade or two is wasting their time by investigating plastination. It’s entirely possible that for a young or middle-aged person, the field will either have succeeded in plastinating an animal or human brain and then uploading it, or will have dead-ended and the fundamental limits discovered, by the time they truly need to choose between cryonics and plastination.)

    3. Are the plastination processes fast enough? Normal plastination preserves brains over weeks to years, which is strictly worse than a hypothetical equally good process requiring only hours. Cryogenic cooling appears to be intrinsically faster than chemical diffusion and action. How much damage does the extra time required do? (There’s some weak evidence that the rate of degradation is roughly constant and hence the damage linear in time.)

  2. Threatens revival parameter #5: a vitrified brain can, presumably, be plasticized if necessary; a plasticized brain, however, is permanently plasticized. The plasticized brain has only 1 option, while a vitrified brain has 2: normal freezing and repair (whatever that will be), and the plasticized route (scanning and upload, likely). A disjunction of two probabilities is at least as likely as either disjunct. Ease of revival also affects how long storage must succeed - if revival is feasible for both, but cryonics is easier, the cryonics brain will have to last a shorter period than the plastinated brain. (This cuts both ways: if plastinated brains are easier to revive or upload, then it will be the cryonics brains which lose some probability due to the increased wait-time.)

    This may not be a large advantage for cryonics. Most cryonics advocates seem to expect uploading will be the ultimate solution, inasmuch as brain scanning is advancing a lot faster than medical nanotechnology (see the Whole Brain Emulation Roadmap), but there’s still a small probability that a non-upload organic solution will be used, and this small probability is forfeited in the plastination route.
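The disjunction argument can be put in numbers: a vitrified brain succeeds if either the organic-repair route or the scan-and-upload route works, so under independence its revival chance is 1 - (1 - p_repair)(1 - p_upload), never less than either route alone. A sketch with purely illustrative values, not estimates:

```python
def either_route(p_a, p_b):
    """P(A or B) for two independent revival routes."""
    return 1 - (1 - p_a) * (1 - p_b)

p_upload = 0.30  # illustrative: the scan-and-upload route
p_repair = 0.05  # illustrative: the small residual organic-repair route
p_cryonics_revival = either_route(p_upload, p_repair)
print(round(p_cryonics_revival, 3))  # 0.335: the extra route adds only a little
# The plastinated brain forfeits p_repair, keeping only the upload route,
# but may more than make it up on the other parameters of the equation.
```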


Counting the discrete items, we found 4 for plastination and 2 against (yes, one point is counted twice). This is a useless count, of course. Of those favorable 4, 2 seem to me to be probability differentials of magnitude. Of the unfavorable 2, 1 seems to be of magnitude. This count favors plastination as well.

I believe the above fairly sets out the signs of all the relationships, but it is difficult to fill in specific numbers for oneself, and even more difficult to defend those numbers.

The fundamental question is, does the rapid advance of scanning and the robustness against organizational failure of plastination outweigh the risk that cryonics uniquely preserves key information?



Aschwin de Wolf 2013:

Plastination is one-way, while with proper techniques, the brain can be cryonically stored such that it can later be plastinated (eg. in case of an extended emergency); Mike Darwin did some preliminary experiments in this area and forecasts what such techniques might one day look like:

One of most difficult problems to be overcome when applying this technique to a whole organ the size of a human brain is, how do you keep the circulatory system accessible to allow for the replacement of the water in the tissue with the monomer that will subsequently be polymerized into a solid plastic, and to remove the truly enormous amount of heat liberated by the exothermic polymerization reaction?

[Figure 14: A corrosion cast of the circulatory system of the human brain. The extensive vascularization of the brain allows for use of the circulatory system as both a mass and heat exchanger. Gas perfusion of the circulatory system prior to cooling to vitrification temperatures leaves it accessible during cryogenic storage, should fixation and plastination become necessary as a fallback position to cryopreservation.]

This slide (Figure 14) shows the circulatory system of a human brain. This is the real deal, not a model. What you are looking at is something called a corrosion cast. In this case, the arterial circulation of a human brain was injected with a red-tinted plastic material and the brain was then immersed in a strong base, such as a concentrated solution of sodium hydroxide. The base dissolves or corrodes the tissue away, leaving behind the red plastic framework of the arterial circulation. It’s easy to see that the human brain is a strongly circulated organ; in fact, the brain normally receives 1/3rd of the resting cardiac output, about 1.5 liters of blood per minute. The FFP researchers decided that the best way to achieve both heat and mass exchange was to keep the brain’s circulatory system open and accessible throughout the procedure. In order to achieve this during solidification of the brain, they turned to gas perfusion: replacing the liquid in the circulatory system with gas.

One of the investigators (Mike Darwin) realized that if the circulatory system of human cryonics patients was similarly perfused with gas during cooling to vitrification, not only would cooling be hastened, thus reducing the risk of freezing, but the circulatory system of the patient would remain accessible, even during storage at -150°C. What this meant was that it would thus be theoretically possible to fix and plastinate cryonics patients in the event that cryopreservation was no longer possible.

In this scenario, a patient would be removed from storage to a special apparatus, the Final Fallback Position System (FFPS), where his arterial circulation would be connected to a recirculating system of solvent chilled to -100°C. This solvent would be pumped through the patient and would begin dissolving the viscous cryoprotectant-water solution in the patient’s tissues. The solvent would also contain fixative: initially formaldehyde, to fix the proteins, and finally a highly reactive metal, osmium tetroxide, that is necessary to fix the lipids, which comprise both the cellular and the intracellular membranes. Once the patient had been solvent-substituted and fixed in this fashion, it would then be possible to safely warm him up to room temperature and introduce the monomer required for plastination. In fact, if necessary, this could be done by immersion, rather than by perfusion (though this would necessitate removal of the brain from the head).






“Electron imaging technology for whole brain neural circuit mapping”, Hayworth 2012:

As mentioned above, the diameter of neuronal processes routinely shrink to less than 100 nm; for example, dendritic spine necks and fine axons can shrink down to ~40nm. With standard tissue fixation and embedding protocols the membranes of these tubular structures are made electron dense, hence to resolve the tubular nature of these neuronal processes one requires resolutions on the order of 10 nm or less. Today both the transmission electron microscope (TEM) and the scanning electron microscope (SEM) can easily achieve such resolution while providing a signal to noise ratio sufficient to trace the finest neuropil in osmium fixed, heavy metal stained tissue. However, until relatively recently only the TEM was used for tracing neuronal circuits. This reliance on TEM has been a major roadblock to attempts at large-scale automation since TEM requires that sections be physically cut thin enough (<100nm) for electrons to pass through and that they be mounted on gossamer thin plastic films throughout the imaging process. If scanning, as opposed to transmission, electron microscopy could be used it would open up many more possibilities for robust automation since then at least the thin sections could be collected and imaged on a thick sturdy substrate [Hayworth, 2008], but before the widespread introduction of high-brightness field emission electron sources (as opposed to tungsten thermionic sources) one simply could not achieve SEM electron probes of small enough diameter (5 nm) and high enough current to allow quality imaging of neural tissue [Bogner et al 2007; Joy, 1991].

In 2004, Denk and Horstman showed that this reliance on TEM for tracing neural circuits could in fact be overcome. They published a seminal paper demonstrating the SBFSEM (Serial Block Face Scanning Electron Microscopy) method which could robustly automate the process of obtaining a series of electron micrographs from a block of neural tissue [Denk and Horstman, 2004]. Using a high-brightness field emission SEM equipped with a low-energy backscatter electron detector they first showed that one could obtain high-resolution images directly from the face of the tissue block (thus eliminating the need to collect ultra-thin sections). Then, following an original design from Leighton [1981], they built an ultramicrotome into the vacuum chamber of the SEM which would repeatedly scrape (using an extremely sharp diamond knife) 50nm layers of material off the surface of the tissue block while the SEM was used to image each freshly revealed block face at high resolution. The result was a fully automated method to volume image a block of neural tissue. They and others have continued to refine this technique so that it can now achieve 23nm section thickness, and have successfully applied the technique to a reconstruction of the direction selective circuitry of the mammalian retina [Briggman et al 2011].

…In 2008, Graham Knott et al. introduced the FIBSEM (Focused Ion Beam Scanning Electron Microscopy) technique for tracing neural circuits [Knott et al 2008]. This technique is also known as Ion Abrasion Scanning Electron Microscopy [Heymann et al 2009]. FIBSEM works similarly to SBFSEM, but instead of physically scraping a thin layer of material off the block face with a diamond knife, the material is instead ablated away using a focused beam of gallium ions. This change overcomes the lateral resolution limitations of the SBFSEM since the FIB ablation process is relatively insensitive to the material properties of the embedding plastic and therefore a much larger electron dose can be used during imaging. What’s more, ion beams can be tightly focused, achieving spot sizes in the 10 nm range. Aligning the ion beam parallel to the block face so that it just grazes the surface allows reliable and rapid removal of extremely thin layers less than 10 nm in thickness. What actually limits the resolution of the FIBSEM (for optimally sized blocks) is the depth of penetration of the imaging electrons into the surface of the block [Muller-Reichert et al 2010], and by using very low voltages (< 2 kV), along with sensitive low-energy backscatter electron detectors with energy filtering, recent reports [Knott and Cantoni, 2011] have demonstrated depth resolutions in the range of ~10nm with lateral imaging resolutions of 5x5 nm. This FIBSEM voxel resolution of 5x5x10nm is more than sufficient to reliably resolve the finest neuronal processes and synapses, and should be sufficient to allow for extremely reliable automated reconstruction algorithms [Lucchi et al 2010]. Recently Knott has published a Journal of Visualized Experiments online video article covering the entire FIBSEM procedure from tissue processing to automated FIBSEM milling and imaging, demonstrating a final volume image of a piece of cortex imaged with 5x5x5 nm voxels [Knott et al 2011].
It is stunning how clearly demarcated the neuronal processes and synapses are in the resulting FIBSEM volume video.

…In summary, the FIBSEM technique represents a truly enabling technology for mapping neural circuits with 100% reliability. It can achieve voxel resolutions of at least 5x5x10 nm, sufficient to allow straightforward algorithms for computer automated tracing of all neuronal processes and identification of all synaptic connections [Merchán-Pérez et al 2009] along with the morphological parameters which are correlated with their strength (eg. area of contact and size of post synaptic density). Furthermore, its use of a focused ion beam (as opposed to traditional physical sectioning with a diamond knife) gives it the potential to achieve the very high reliability levels needed for large-scale automation. A serious limitation of today’s FIBSEM systems is that they can achieve such high-resolution images only over tiny volumes - typically on the order of a few tens of microns across. However, this limitation can be relatively easily overcome with the use of a lossless subdivision technique like the one described above. And the availability of such a lossless sub-division technique opens up the possibility of massively parallel FIBSEM imaging, something that would, in any case, be necessary in order to image any relatively large volume of tissue in a reasonable amount of time.

…Today’s most typically used tissue preparation protocols for electron microscopy are only able to prepare volumes of less than 1 mm³. An animal’s vascular system is perfused through the heart with a mixture of paraformaldehyde and glutaraldehyde. The paraformaldehyde quickly stops cellular degradation and, at a slightly slower rate, the glutaraldehyde provides stronger crosslinks to fix proteins in place. If performed carefully, such perfusion through the vascular system is able to quickly fix the entire nervous system of the animal since every cell is within a few tens of microns of a capillary. However, following this fixative perfusion step the brain of the animal is removed and a very small piece dissected to undergo the remaining tissue preparation steps - which are typically all performed by simple immersion in chemicals. These steps include immersion in osmium tetroxide (to fix membrane lipid molecules in place), immersion in heavy metal staining solutions (eg. uranyl acetate), immersion in a graded series of alcohols (to remove water from the tissue), and finally immersion in a plastic resin dissolved in an organic solvent (to completely infiltrate the tissue with the heat-curable resin). Because these steps are performed by simple immersion (and thus diffusion) the process fails if attempted on blocks larger than 1 mm³. The result is destruction of tissue ultrastructure and poor staining in the depths of the block.

Obviously such volume limitations must be overcome before human mind uploading can be attempted. Although it has never been demonstrated that a whole mammalian brain can be preserved at the ultrastructure level for electron microscopic imaging, there are many results that suggest that a protocol could be developed to do just that. Perfusion fixation with osmium tetroxide has been demonstrated on whole brains [Palay et al 1962], and serial vascular perfusion first with glutaraldehyde, followed by osmium tetroxide and uranyl acetate, and finally by an alcohol dehydration series has been demonstrated on whole organs [Bachofen et al 1982; Oldmixon et al 1985] demonstrating sufficient preservation to allow ultrastructure studies. Plastic infiltration of whole mouse brains has also been demonstrated [Mayerich et al 2008], and there are recent reports of the development of a full ultrastructure fixation, staining, and embedding protocol for the mouse brain for use in the mapping of long distance axon trajectories via serial block face SEM [Mikula et al 2011]. There is even a challenge prize being offered for the first demonstration of such ultrastructure preservation across an entire large mammalian brain [Hayworth, 2011]. Given these results and the fact that the burgeoning field of Connectomics will require larger and larger volumes of high-quality prepared tissue, it is likely that a protocol to preserve and plastic-embed an entire human brain at the ultrastructure level will be perfected relatively soon.









  1. As far as I can tell, pretty much every pro-con for plastination vs cryonics applies to chemical fixation with the exception of the lipids, so in the following I treat them as synonymous.↩︎

  2. Apparently it’s best to store even a plasticized brain in cryogenic storage; Jordan Sparks says “If they don’t transition to cryopreservation, damage will be ongoing for decades.”↩︎

  3. “The Armories of the Latter Day Laputas, Part 5” has the statistics. Besides 98% of startups dying, established companies die frequently: “…if we re-set the graph at 5 years, and then follow the remaining cohort of enterprises out to the 10 year mark, the mortality rate is still quite high with only 29% of businesses surviving.” The total mortality is considerable: “Thus, the chances of a business entity (excluding religious and academic institutions) surviving for >100 years is 1.096%” One sometimes sees people provide their own values for a cryonics Drake equation - often the result is a comforting 1-5%. This suggests they expect to be revived very soon, think cryonics organizations are exempt from these statistics, or are unaware of them. Being non-profit helps only a little: “However, by the 30 year mark, ~95% of NPOs have failed.”↩︎

  4. The brain of the patient “HM”, processed by The Brain Observatory of UC San Diego.↩︎

  5. Although hopefully in ways more efficient than a few professors and technicians laboring together or yoking dozens of students! In this vein, the Mouse Brain Architecture Project is something to watch - what can be done to a mouse brain may, a few generations later, be done to a human brain.↩︎

  6. I have a general skepticism about applied medicine and biology (and revivification is very applied), from a lifetime of broken promises and failed predictions about the coming fruits of medicine and biology. I recently ran across an example in a Tanner lecture by Donald Brown, then an accomplished professor of biology at Johns Hopkins:

    We don’t know yet how these genes work, but techniques of modern genetics which led to the discovery of these genes in the first place will provide these insights. In the next ten years we are going to understand better and perhaps even cure some of the most serious diseases that afflict mankind, such as diabetes, arteriosclerosis, parasitic diseases, the common cold, cystic fibrosis, certain kinds of arthritis, immune diseases, and infectious diseases, just to name a few. A molecular basis for at least some kinds of schizophrenia will be found. We will learn about the biochemistry of the aging process, which also has a strong genetic component. This doesn’t guarantee prolongation of life, but rather an improvement of the quality of life in old age. We need sensitive assays for the effects of chemicals, pollutants, and drugs as causative agents of birth defects like those developed to determine carcinogenic potential. There have yet to be developed simple, safe, and reversible contraceptives for males.

    This reads like it was written yesterday, and not in 1984 - more than 27 years ago. The predictions about the Human Genome Project have fared little better. My rule of thumb is that in the future, our understanding and information will outstrip our laboratory prototypes by even more than today, and the prototypes as much outstrip the generally available products; hence, I am unsurprised by the astounding progress in DNA sequencing, and equally unsurprised by the astounding dearth of new drugs and treatments. Fortunately for plastinated brains, enough information and understanding can make up for stasis in other areas; as long as computers and scanning technology continue to advance, things may yet work out for them.↩︎

  7. From Perry’s article about chemical fixation:

    As with cryonics itself, the basic answers are unknown. Some encouragement is provided by the high level of detail seen in preserved brain samples using, for example, formaldehyde fixation. Ultrastructural details under the high magnification of electron microscopy (10,000x plus) are quite clear, though this alone is not a demonstration that all the details one would like are present. However, the same problem exists with tissue preserved cryogenically - the answer to whether the preservation captures fine enough details is unknown, though there are at least some encouraging signs along with reasons for concern.

    He is not so sanguine about plastination:

    A possible drawback of this approach, from the standpoint of preserving the fine structures that are especially important from a cryonics standpoint, is the relatively harsh regimen needed to produce the finished product. Typically, the process starts with an aldehyde-fixed specimen in aqueous solution. The specimen is placed in acetone, and successive changes of the bath remove water and fats. Finally the resin monomer is introduced, the remaining acetone is removed by vacuum, and induced catalysis yields the desired polymerization. Concerns have been raised about whether defatting would obliterate important brain information, though there does not appear to be strong evidence of this. (Here it is appropriate to mention that lipids nevertheless could contain important information; preservation of lipids is a difficult process that has not been covered in this preliminary survey but deserves consideration.)

    Olson mentions lipids in passing:

    A second example of redundancy involves the locations of the neuronal membranes (ie. the neuronal configuration). The information of the membranes’ positions is contained not only in the physical positions of the membrane lipids, but also in the cellular cytoskeleton (which is made of proteins) whose purpose is, among other things, to hold the membrane in its configuration (15). Thus, even if a substantial proportion of lipids were extracted in the course of chemical preservation of a brain (as is the case with some preservation techniques), it is plausible that the information of the neuronal configuration would still exist in the crosslinked cytoskeletons of the neurons.

