Econstudentlog

The Laws of Thermodynamics

Here’s a relevant Sixty Symbols video with Mike Merrifield. Below are a few observations from the book, and some links.

“Among the hundreds of laws that describe the universe, there lurks a mighty handful. These are the laws of thermodynamics, which summarize the properties of energy and its transformation from one form to another. […] The mighty handful consists of four laws, with the numbering starting inconveniently at zero and ending at three. The first two laws (the ‘zeroth’ and the ‘first’) introduce two familiar but nevertheless enigmatic properties, the temperature and the energy. The third of the four (the ‘second law’) introduces what many take to be an even more elusive property, the entropy […] The second law is one of the all-time great laws of science […]. The fourth of the laws (the ‘third law’) has a more technical role, but rounds out the structure of the subject and both enables and foils its applications.”

“Classical thermodynamics is the part of thermodynamics that emerged during the nineteenth century before everyone was fully convinced about the reality of atoms, and concerns relationships between bulk properties. You can do classical thermodynamics even if you don’t believe in atoms. Towards the end of the nineteenth century, when most scientists accepted that atoms were real and not just an accounting device, there emerged the version of thermodynamics called statistical thermodynamics, which sought to account for the bulk properties of matter in terms of its constituent atoms. The ‘statistical’ part of the name comes from the fact that in the discussion of bulk properties we don’t need to think about the behaviour of individual atoms but we do need to think about the average behaviour of myriad atoms. […] In short, whereas dynamics deals with the behaviour of individual bodies, thermodynamics deals with the average behaviour of vast numbers of them.”

“In everyday language, heat is both a noun and a verb. Heat flows; we heat. In thermodynamics heat is not an entity or even a form of energy: heat is a mode of transfer of energy. It is not a form of energy, or a fluid of some kind, or anything of any kind. Heat is the transfer of energy by virtue of a temperature difference. Heat is the name of a process, not the name of an entity.”

“The supply of 1J of energy as heat to 1 g of water results in an increase in temperature of about 0.2°C. Substances with a high heat capacity (water is an example) require a larger amount of heat to bring about a given rise in temperature than those with a small heat capacity (air is an example). In formal thermodynamics, the conditions under which heating takes place must be specified. For instance, if the heating takes place under conditions of constant pressure with the sample free to expand, then some of the energy supplied as heat goes into expanding the sample and therefore to doing work. Less energy remains in the sample, so its temperature rises less than when it is constrained to have a constant volume, and therefore we report that its heat capacity is higher. The difference between heat capacities of a system at constant volume and at constant pressure is of most practical significance for gases, which undergo large changes in volume as they are heated in vessels that are able to expand.”
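As a quick sanity check on that first figure: with the standard specific heat of water (about 4.18 J g⁻¹ K⁻¹, a textbook value rather than one quoted above),

```latex
\Delta T = \frac{q}{mc} = \frac{1\,\mathrm{J}}{(1\,\mathrm{g})(4.18\,\mathrm{J\,g^{-1}\,K^{-1}})} \approx 0.24\,\mathrm{K},
\qquad
C_p - C_V = nR \quad \text{(ideal gas)}
```

which is consistent with the quoted ‘about 0.2°C’. The second relation is the standard ideal-gas result behind the closing remark: the gap between the constant-pressure and constant-volume heat capacities is a whole gas constant per mole, which is why the distinction matters most for gases.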

“Heat capacities vary with temperature. An important experimental observation […] is that the heat capacity of every substance falls to zero when the temperature is reduced towards absolute zero (T = 0). A very small heat capacity implies that even a tiny transfer of heat to a system results in a significant rise in temperature, which is one of the problems associated with achieving very low temperatures when even a small leakage of heat into a sample can have a serious effect on the temperature”.

“A crude restatement of Clausius’s statement is that refrigerators don’t work unless you turn them on.”

“The Gibbs energy is of the greatest importance in chemistry and in the field of bioenergetics, the study of energy utilization in biology. Most processes in chemistry and biology occur at constant temperature and pressure, and so to decide whether they are spontaneous and able to produce non-expansion work we need to consider the Gibbs energy. […] Our bodies live off Gibbs energy. Many of the processes that constitute life are non-spontaneous reactions, which is why we decompose and putrefy when we die and these life-sustaining reactions no longer continue. […] In biology a very important ‘heavy weight’ reaction involves the molecule adenosine triphosphate (ATP). […] When a terminal phosphate group is snipped off by reaction with water […], to form adenosine diphosphate (ADP), there is a substantial decrease in Gibbs energy, arising in part from the increase in entropy when the group is liberated from the chain. Enzymes in the body make use of this change in Gibbs energy […] to bring about the linking of amino acids, and gradually build a protein molecule. It takes the effort of about three ATP molecules to link two amino acids together, so the construction of a typical protein of about 150 amino acid groups needs the energy released by about 450 ATP molecules. […] The ADP molecules, the husks of dead ATP molecules, are too valuable just to discard. They are converted back into ATP molecules by coupling to reactions that release even more Gibbs energy […] and which reattach a phosphate group to each one. These heavy-weight reactions are the reactions of metabolism of the food that we need to ingest regularly.”
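The bookkeeping in that last part is easy to make concrete. A minimal sketch, assuming the standard textbook figure of roughly −30 kJ per mole of ATP hydrolysed (that number is not from the book):

```python
# A minimal sketch of the book's ATP arithmetic for protein synthesis.
# dG_atp is a standard textbook value (~ -30 kJ/mol), not from the book.
atp_per_peptide_link = 3       # ATP molecules spent per amino acid linked
amino_acids = 150              # a typical protein, per the quote
dG_atp = -30.0                 # kJ per mol of ATP hydrolysed (approximate)

atp_needed = atp_per_peptide_link * amino_acids
print(atp_needed)              # 450, matching the book's estimate
print(atp_needed * dG_atp)     # ~ -13500 kJ of Gibbs energy per mol of protein
```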

Links of interest below – the stuff covered in the links is the sort of stuff covered in this book:

Laws of thermodynamics (article includes links to many other articles of interest, including links to each of the laws mentioned above).
System concepts.
Intensive and extensive properties.
Mechanical equilibrium.
Thermal equilibrium.
Diathermal wall.
Thermodynamic temperature.
Thermodynamic beta.
Ludwig Boltzmann.
Boltzmann constant.
Maxwell–Boltzmann distribution.
Conservation of energy.
Work (physics).
Internal energy.
Heat (physics).
Microscopic view of heat.
Reversible process (thermodynamics).
Carnot’s theorem.
Enthalpy.
Fluctuation-dissipation theorem.
Noether’s theorem.
Entropy.
Thermal efficiency.
Rudolf Clausius.
Spontaneous process.
Residual entropy.
Heat engine.
Coefficient of performance.
Helmholtz free energy.
Gibbs free energy.
Phase transition.
Chemical equilibrium.
Superconductivity.
Superfluidity.
Absolute zero.


February 5, 2017 | Biology, Books, Chemistry, Physics

Galaxies

I have added some observations from the book below, as well as some links covering people/ideas/stuff discussed/mentioned in the book.

“On average, out of every 100 newly born star systems, 60 are binaries and 40 are triples. Solitary stars like the Sun are later ejected from triple systems formed in this way.”

“…any object will become a black hole if it is sufficiently compressed. For any mass, there is a critical radius, called the Schwarzschild radius, for which this occurs. For the Sun, the Schwarzschild radius is just under 3 km; for the Earth, it is just under 1 cm. In either case, if the entire mass of the object were squeezed within the appropriate Schwarzschild radius it would become a black hole.”
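The formula behind both numbers is the standard Schwarzschild expression r_s = 2GM/c². A minimal sketch that reproduces the quote’s two figures (the masses and constants are standard values, not taken from the book):

```python
# Minimal sketch: Schwarzschild radius r_s = 2*G*M/c^2 (standard formula).
G = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8      # speed of light, m/s

def schwarzschild_radius(mass_kg):
    return 2 * G * mass_kg / c**2

print(schwarzschild_radius(1.989e30))   # Sun:   ~2954 m, just under 3 km
print(schwarzschild_radius(5.972e24))   # Earth: ~0.0089 m, just under 1 cm
```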

“It only became possible to study the centre of our Galaxy when radio telescopes and other instruments that do not rely on visible light became available. There is a great deal of dust in the plane of the Milky Way […] This blocks out visible light. But longer wavelengths penetrate the dust more easily. That is why sunsets are red – short wavelength (blue) light is scattered out of the line of sight by dust in the atmosphere, while the longer wavelength red light gets through to your eyes. So our understanding of the galactic centre is largely based on infrared and radio observations.”

“there is strong evidence that the Milky Way Galaxy is a completely ordinary disc galaxy, a typical representative of its class. Since that is the case, it means that we can confidently use our inside knowledge of the structure and evolution of our own Galaxy, based on close-up observations, to help our understanding of the origin and nature of disc galaxies in general. We do not occupy a special place in the Universe; but this was only finally established at the end of the 20th century. […] in the decades following Hubble’s first measurements of the cosmological distance scale, the Milky Way still seemed like a special place. Hubble’s calculation of the distance scale implied that other galaxies are relatively close to our Galaxy, and so they would not have to be very big to appear as large as they do on the sky; the Milky Way seemed to be by far the largest galaxy in the Universe. We now know that Hubble was wrong. […] the value he initially found for the Hubble Constant was about seven times bigger than the value accepted today. In other words, all the extragalactic distances Hubble inferred were seven times too small. But this was not realized overnight. The cosmological distance scale was only revised slowly, over many decades, as observations improved and one error after another was corrected. […] The importance of determining the cosmological distance scale accurately, more than half a century after Hubble’s pioneering work, was still so great that it was a primary justification for the existence of the Hubble Space Telescope (HST).”
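How a too-big Hubble constant becomes too-small distances follows directly from Hubble’s law, d = v/H0. A minimal sketch with round numbers (the recession velocity is hypothetical; the two H0 values are round versions of Hubble’s early estimate and the modern one):

```python
# Minimal sketch of Hubble's law d = v / H0 (distances in Mpc).
v = 7000.0             # km/s, a hypothetical recession velocity
H0_hubble = 500.0      # km/s/Mpc, roughly Hubble's original value
H0_modern = 70.0       # km/s/Mpc, roughly the value accepted today

print(v / H0_hubble)   # 14 Mpc  -- the distance Hubble would have inferred
print(v / H0_modern)   # 100 Mpc -- about seven times farther away
```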

“The key point to grasp […] is that the expansion described by [Einstein’s] equations is an expansion of space as time passes. The cosmological redshift is not a Doppler effect caused by galaxies moving outward through space, as if fleeing from the site of some great explosion, but occurs because the space between the galaxies is stretching. So the spaces between galaxies increase while light is on its way from one galaxy to another. This stretches the light waves to longer wavelengths, which means shifting them towards the red end of the spectrum. […] The second key point about the universal expansion is that it does not have a centre. There is nothing special about the fact that we observe galaxies receding with redshifts proportional to their distances from the Milky Way. […] whichever galaxy you happen to be sitting in, you will see the same thing – redshift proportional to distance.”
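In standard cosmological notation (this is the textbook relation, not a formula quoted from the book), the stretching of wavelengths tracks the growth of the scale factor a(t):

```latex
1 + z = \frac{\lambda_{\mathrm{obs}}}{\lambda_{\mathrm{emit}}} = \frac{a(t_{\mathrm{obs}})}{a(t_{\mathrm{emit}})}
```

so a redshift of z = 1 means the Universe has doubled in size while the light was in transit.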

“The age of the Universe is determined by studying some of the largest things in the Universe, clusters of galaxies, and analysing their behaviour using the general theory of relativity. Our understanding of how stars work, from which we calculate their ages, comes from studying some of the smallest things in the Universe, the nuclei of atoms, and using the other great theory of 20th-century physics, quantum mechanics, to calculate how nuclei fuse with one another to release the energy that keeps stars shining. The fact that the two ages agree with one another, and that the ages of the oldest stars are just a little bit less than the age of the Universe, is one of the most compelling reasons to think that the whole of 20th-century physics works and provides a good description of the world around us, from the very small scale to the very large scale.”
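A crude cross-check on the quoted agreement is the Hubble time, the reciprocal of the Hubble constant (a standard back-of-the-envelope estimate, not the book’s cluster-based calculation):

```latex
t_H = \frac{1}{H_0}, \qquad H_0 \approx 70\ \mathrm{km\,s^{-1}\,Mpc^{-1}} = \frac{70}{3.09\times10^{19}}\ \mathrm{s^{-1}}
\;\;\Rightarrow\;\; t_H \approx 4.4\times10^{17}\ \mathrm{s} \approx 14\ \mathrm{Gyr},
```

which comes out close to the currently accepted age of about 13.8 billion years.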

“Planets are small objects orbiting a large central mass, and the gravity of the Sun dominates their motion. Because of this, the speed with which a planet moves […] is inversely proportional to the square root of its distance from the centre of the Solar System. Jupiter is farther from the Sun than we are, so it moves more slowly in its orbit than the Earth, as well as having a larger orbit. But all the stars in the disc of a galaxy move at the same speed. Stars farther out from the centre still have bigger orbits, so they still take longer to complete one circuit of the galaxy. But they are all travelling at essentially the same orbital speed through space.”
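The Solar System half of that comparison is just Kepler: for a body orbiting a dominant central mass, v = √(GM/r), so speed falls as one over the square root of distance. A minimal sketch (standard constants, not code from the book):

```python
# Minimal sketch: Keplerian orbital speed v = sqrt(G*M/r) around a
# dominant central mass -- speed falls off as 1/sqrt(r).
import math

G = 6.674e-11      # m^3 kg^-1 s^-2
M_sun = 1.989e30   # kg
AU = 1.496e11      # m, the Earth-Sun distance

def orbital_speed(r_m):
    return math.sqrt(G * M_sun / r_m)

print(orbital_speed(1.0 * AU) / 1e3)   # Earth:   ~29.8 km/s
print(orbital_speed(5.2 * AU) / 1e3)   # Jupiter: ~13.1 km/s
```

In a galactic disc, by contrast, the quoted flat rotation curve means v stays roughly constant with radius, which is one of the classic observations pointing to dark matter.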

“The importance of studying objects at great distances across the Universe is that when we look at an object that is, say, 10 billion light years away, we see it by light which left it 10 billion years ago. This is the ‘look back time’, and it means that telescopes are in a sense time machines, showing us what the Universe was like when it was younger. The light from a distant galaxy is old, in the sense that it has been a long time on its journey; but the galaxy we see using that light is a young galaxy. […] For distant objects, because light has taken a long time on its journey to us, the Universe has expanded significantly while the light was on its way. […] This raises problems defining exactly what you mean by the ‘present distance’ to a remote galaxy”

“Among the many advantages that photographic and electronic recording methods have over the human eye, the most fundamental is that the longer they look, the more they see. Human eyes essentially give us a real-time view of our surroundings, and allow us to see things – such as stars – that are brighter than a certain limit. If an object is too faint to see, once your eyes have adapted to the dark no amount of staring in its direction will make it visible. But the detectors attached to modern telescopes keep on adding up the light from faint sources as long as they are pointing at them. A longer exposure will reveal fainter objects than a short exposure does, as the photons (particles of light) from the source fall on the detector one by one and the total gradually grows.”
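The statistics behind ‘the longer they look, the more they see’ are Poissonian: the accumulated signal grows linearly with exposure time while the photon shot noise grows only as its square root. A minimal sketch with a made-up photon rate:

```python
# Minimal sketch: signal-to-noise of a photon-counting detector.
# Poisson noise on N photons is sqrt(N), so S/N = N/sqrt(N) = sqrt(N),
# improving with the square root of exposure time. The rate is made up.
import math

rate = 0.5                     # photons per second from a faint source
for t in (1, 100, 10_000):     # exposure times in seconds
    n = rate * t               # expected photon count
    print(t, n, math.sqrt(n))  # exposure, photons collected, S/N
```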

“Nobody can be quite sure where the supermassive black holes at the hearts of galaxies today came from, but it seems at least possible that […] merging of black holes left over from the first generation of stars [in the universe] began the process by which supermassive black holes, feeding off the matter surrounding them, formed. […] It seems very unlikely that supermassive black holes formed first and then galaxies grew around them; they must have formed together, in a process sometimes referred to as co-evolution, from the seeds provided by the original black holes of a few hundred solar masses and the raw materials of the dense clouds of baryons in the knots in the filamentary structure. […] About one in a hundred of the galaxies seen at low redshifts are actively involved in the late stages of mergers, but these processes take so little time, compared with the age of the Universe, that the statistics imply that about half of all the galaxies visible nearby are the result of mergers between similarly sized galaxies in the past seven or eight billion years. Disc galaxies like the Milky Way seem themselves to have been built up from smaller sub-units, starting out with the spheroid and adding bits and pieces as time passed. […] there were many more small galaxies when the Universe was young than we see around us today. This is exactly what we would expect if many of the small galaxies have either grown larger through mergers or been swallowed up by larger galaxies.”

Links of interest:

Galaxy (‘featured article’).
Leonard Digges.
Thomas Wright.
William Herschel.
William Parsons.
The Great Debate.
Parallax.
Extinction (astronomy).
Henrietta Swan Leavitt (‘good article’).
Cepheid variable.
Ejnar Hertzsprung. (Before reading this book, I had no idea one of the people behind the famous Hertzsprung–Russell diagram was a Dane. I blame my physics teachers. I was probably told this by one of them, but if the guy in question had been a better teacher, I’d have listened, and I’d have known this.)
Globular cluster (‘featured article’).
Vesto Slipher.
Redshift (‘featured article’).
Refracting telescope/Reflecting telescope.
Disc galaxy.
Edwin Hubble.
Milton Humason.
Doppler effect.
Milky Way.
Orion Arm.
Stellar population.
Sagittarius A*.
Minkowski space.
General relativity (featured).
The Big Bang theory (featured).
Age of the universe.
Malmquist bias.
Type Ia supernova.
Dark energy.
Baryons/leptons.
Cosmic microwave background.
Cold dark matter.
Lambda-CDM model.
Lenticular galaxy.
Active galactic nucleus.
Quasar.
Hubble Ultra-Deep Field.
Stellar evolution.
Velocity dispersion.
Hawking radiation.
Ultimate fate of the universe.


February 5, 2017 | Astronomy, Books, Cosmology, Physics

Diabetes and the Brain (III)

Some quotes from the book below.

“Tests that are used in clinical neuropsychology in most cases examine one or more aspects of cognitive domains, which are theoretical constructs in which a multitude of cognitive processes are involved. […] By definition, a subdivision in cognitive domains is arbitrary, and many different classifications exist. […] for a test to be recommended, several criteria must be met. First, a test must have adequate reliability: the test must yield similar outcomes when applied over multiple test sessions, i.e., have good test–retest reliability. […] Furthermore, the interobserver reliability is important, in that the test must have a standardized assessment procedure and is scored in the same manner by different examiners. Second, the test must have adequate validity. Here, different forms of validity are important. Content validity is established by expert raters with respect to item formulation, item selection, etc. Construct validity refers to the underlying theoretical construct that the test is assumed to measure. To assess construct validity, both convergent and divergent validities are important. Convergent validity refers to the amount of agreement between a given test and other tests that measure the same function. In turn, a test with a good divergent validity correlates minimally with tests that measure other cognitive functions. Moreover, predictive validity (or criterion validity) is related to the degree of correlation between the test score and an external criterion, for example, the correlation between a cognitive test and functional status. […] it should be stressed that cognitive tests alone cannot be used as ultimate proof for organic brain damage, but should be used in combination with more direct measures of cerebral abnormalities, such as neuroimaging.”
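In practice, test–retest reliability and convergent/divergent validity are all estimated as correlations between series of scores. A minimal sketch of the test–retest case, with invented scores (requires Python 3.10+ for statistics.correlation):

```python
# Minimal sketch: test-retest reliability as the Pearson correlation
# between two administrations of the same test. Scores are invented.
from statistics import correlation  # Python 3.10+

session_1 = [24, 30, 18, 27, 22, 29, 21, 25]
session_2 = [25, 29, 17, 28, 21, 30, 20, 24]

r = correlation(session_1, session_2)
print(round(r, 3))  # close to 1 -> scores are stable across sessions
```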

“Intelligence is a theoretically ill-defined construct. In general, it refers to the ability to think in an abstract manner and solve new problems. Typically, two forms of intelligence are distinguished, crystallized intelligence (academic skills and knowledge that one has acquired during schooling) and fluid intelligence (the ability to solve new problems). Crystallized intelligence is better preserved in patients with brain disease than fluid intelligence (3). […] From a neuropsychological viewpoint, the concept of intelligence as a unitary construct (often referred to as g-factor) does not provide valuable information, since deficits in specific cognitive functions may be averaged out in the total IQ score. Thus, in most neuropsychological studies, intelligence tests are included because of specific subtests that are assumed to measure specific cognitive functions, and the performance profile is analyzed rather than considering the IQ measure as a compound score in isolation.”

“Attention is a concept that in general relates to the selection of relevant information from our environment and the suppression of irrelevant information (selective or “focused” attention), the ability to shift attention between tasks (divided attention), and to maintain a state of alertness to incoming stimuli over longer periods of time (concentration and vigilance). Many different structures in the human brain are involved in attentional processing and, consequently, disorders in attention occur frequently after brain disease or damage (21). […] Speed of information processing is not a localized cognitive function, but depends greatly on the integrity of the cerebral network as a whole, the subcortical white matter and the interhemispheric and intrahemispheric connections. It is one of the cognitive functions that clearly declines with age and it is highly susceptible to brain disease or dysfunction of any kind.”

“The Mini-Mental State Examination (MMSE) is a screening instrument that has been developed to determine whether older adults have cognitive impairments […] numerous studies have shown that the MMSE has poor sensitivity and specificity, as well as low test–retest reliability […] the MMSE has been developed to determine cognitive decline that is typical for Alzheimer’s dementia, but has been found less useful in determining cognitive decline in nondemented patients (44) or in patients with other forms of dementia. This is important since odds ratios for both vascular dementia and Alzheimer’s dementia are increased in diabetes (45). Notwithstanding this increased risk, most patients with diabetes have subtle cognitive deficits (46, 47) that may easily go undetected using gross screening instruments such as the MMSE. For research in diabetes a high sensitivity is thus especially important. […] ceiling effects in test performance often result in a lack of sensitivity. Subtle impairments are easily missed, resulting in a high proportion of false-negative cases […] In general, tests should be cognitively demanding to avoid ceiling effects in patients with mild cognitive dysfunction. […] sensitive domains such as speed of information processing, (working) memory, attention, and executive function should be examined thoroughly in diabetes patients, whereas other domains such as language, motor function, and perception are less likely to be affected. Intelligence should always be taken into account, and confounding factors such as mood, emotional distress, and coping are crucial for the interpretation of the neuropsychological test results.”
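The sensitivity and specificity the passage leans on come straight from a 2×2 screening table. A minimal sketch with hypothetical counts (not actual MMSE data), chosen to illustrate a screen that misses many subtle cases:

```python
# Minimal sketch: sensitivity and specificity of a screening test.
# The counts are hypothetical, not MMSE data.
tp, fn = 30, 20   # truly impaired: flagged vs. missed by the screen
tn, fp = 45, 5    # truly intact: correctly passed vs. wrongly flagged

sensitivity = tp / (tp + fn)   # 0.6 -- 40% of impaired patients missed
specificity = tn / (tn + fp)   # 0.9
print(sensitivity, specificity)
```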

“The life-time risk of any dementia has been estimated to be more than 1 in 5 for women and 1 in 6 for men (2). Worldwide, about 24 million people have dementia, with 4.6 million new cases of dementia every year (3). […] Dementia can be caused by various underlying diseases, the most common of which is Alzheimer’s disease (AD) accounting for roughly 70% of cases in the elderly. The second most common cause of dementia is vascular dementia (VaD), accounting for 16% of cases. Other, less common, causes include dementia with Lewy bodies (DLB) and frontotemporal lobar degeneration (FTLD). […] It is estimated that both the incidence and the prevalence [of AD] double with every 5-year increase in age. Other risk factors for AD include female sex and vascular risk factors, such as diabetes, hypercholesterolaemia and hypertension […] In contrast with AD, progression of cognitive deficits [in VaD] is mostly stepwise and with an acute or subacute onset. […] it is clear that cerebrovascular disease is one of the major causes of cognitive decline. Vascular risk factors such as diabetes mellitus and hypertension have been recognized as risk factors for VaD […] Although pure vascular dementia is rare, cerebrovascular pathology is frequently observed on MRI and in pathological studies of patients clinically diagnosed with AD […] Evidence exists that AD and cerebrovascular pathology act synergistically (60).”
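The quoted doubling of AD incidence and prevalence with every 5 years of age is easy to make concrete. A minimal sketch; the 1% baseline at age 65 is hypothetical, for illustration only:

```python
# Minimal sketch of "doubles with every 5-year increase in age".
# The baseline prevalence is hypothetical.
base_age, base_prevalence = 65, 0.01

for age in range(65, 91, 5):
    prevalence = base_prevalence * 2 ** ((age - base_age) / 5)
    print(age, round(prevalence, 3))  # 0.01, 0.02, 0.04, ..., 0.32
```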

“In type 1 diabetes the annual prevalence of severe hypoglycemia (requiring help for recovery) is 30–40% while the annual incidence varies depending on the duration of diabetes. In insulin-treated type 2 diabetes, the frequency is lower but increases with duration of insulin therapy. […] In normal health, blood glucose is maintained within a very narrow range […] The functioning of the brain is optimal within this range; cognitive function rapidly becomes impaired when the blood glucose falls below 3.0 mmol/l (54 mg/dl) (3). Similarly, but much less dramatically, cognitive function deteriorates when the brain is exposed to high glucose concentrations” (I did not know the latter for certain, but I certainly have had my suspicions for a long time).
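The two units in that quote are linked by the conventional factor of 18, which comes from glucose’s molar mass of roughly 180 g/mol and the decilitre (0.1 l) in mg/dl:

```python
# Minimal sketch: the conventional mmol/l -> mg/dl conversion for glucose
# (factor 18: ~180 g/mol molar mass, concentrations per decilitre).
def mmol_to_mgdl(mmol_per_l):
    return mmol_per_l * 18.0

print(mmol_to_mgdl(3.0))  # 54.0 mg/dl, the quoted impairment threshold
```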

“When exogenous insulin is injected into a non-diabetic adult human, peripheral tissues such as skeletal muscle and adipose tissue rapidly take up glucose, while hepatic glucose output is suppressed. This causes blood glucose to fall and triggers a series of counterregulatory events to counteract the actions of insulin; this prevents a progressive decline in blood glucose and subsequently reverses the hypoglycemia. In people with insulin-treated diabetes, many of the homeostatic mechanisms that regulate blood glucose are either absent or deficient. [If you’re looking for more details, it should perhaps be noted here that Philip Cryer’s book on these topics is very nice and informative]. […] The initial endocrine response to a fall in blood glucose in non-diabetic humans is the suppression of endogenous insulin secretion. This is followed by the secretion of the principal counterregulatory hormones, glucagon and epinephrine (adrenaline) (5). Cortisol and growth hormone also contribute, but have greater importance in promoting recovery during exposure to prolonged hypoglycemia […] Activation of the peripheral sympathetic nervous system and the adrenal glands provokes the release of a copious quantity of catecholamines, epinephrine, and norepinephrine […] Glucagon is secreted from the alpha cells of the pancreatic islets, apparently in response to localized neuroglycopenia and independent of central neural control. […] The large amounts of catecholamines that are secreted in response to hypoglycemia exert other powerful physiological effects that are unrelated to counterregulation. These include major hemodynamic actions with direct effects on the heart and blood pressure. […] regional blood flow changes occur during hypoglycemia that encourage the transport of substrates to the liver for gluconeogenesis and simultaneously of glucose to the brain. Organs that have no role in the response to acute stress, such as the spleen and kidneys, are temporarily under-perfused. The mobilisation and activation of white blood cells are accompanied by hemorheological effects, promoting increased viscosity, coagulation, and fibrinolysis and may influence endothelial function (6). In normal health these acute physiological changes probably exert no harmful effects, but may acquire pathological significance in people with diabetes of long duration.”

“The more complex and attention-demanding cognitive tasks, and those that require speeded responses are more affected by hypoglycemia than simple tasks or those that do not require any time restraint (3). The overall speed of response of the brain in making decisions is slowed, yet for many tasks, accuracy is preserved at the expense of speed (8, 9). Many aspects of mental performance become impaired when blood glucose falls below 3.0 mmol/l […] Recovery of cognitive function does not occur immediately after the blood glucose returns to normal, but in some cognitive domains may be delayed for 60 min or more (3), which is of practical importance to the performance of tasks that require complex cognitive functions, such as driving. […] [the] major changes that occur during hypoglycemia – counterregulatory hormone secretion, symptom generation, and cognitive dysfunction – occur as components of a hierarchy of responses, each being triggered as the blood glucose falls to its glycemic threshold. […] In nondiabetic individuals, the glycemic thresholds are fixed and reproducible (10), but in people with diabetes, these thresholds are dynamic and plastic, and can be modified by external factors such as glycemic control or exposure to preceding (antecedent) hypoglycemia (11). Changes in the glycemic thresholds for the responses to hypoglycemia underlie the effects of the acquired hypoglycemia syndromes that can develop in people with insulin-treated diabetes […] the incidence of severe hypoglycemia in people with insulin-treated type 2 diabetes increases steadily with duration of insulin therapy […], as pancreatic beta-cell failure develops. The under-recognized risk of severe hypoglycemia in insulin-treated type 2 diabetes is of great practical importance as this group is numerically much larger than people with type 1 diabetes and encompasses many older, and some very elderly, people who may be exposed to much greater danger because they often have co-morbidities such as macrovascular disease, osteoporosis, and general frailty.”

“Hypoglycemia occurs when a mismatch develops between the plasma concentrations of glucose and insulin, particularly when the latter is inappropriately high, which is common during the night. Hypoglycemia can result when too much insulin is injected relative to oral intake of carbohydrate or when a meal is missed or delayed after insulin has been administered. Strenuous exercise can precipitate hypoglycemia through accelerated absorption of insulin and depletion of muscle glycogen stores. Alcohol enhances the risk of prolonged hypoglycemia by inhibiting hepatic gluconeogenesis, but the hypoglycemia may be delayed for several hours. Errors of dosage or timing of insulin administration are common, and there are few conditions where the efficacy of the treatment can be influenced by so many extraneous factors. The time–action profiles of different insulins can be modified by factors such as the ambient temperature or the site and depth of injection and the person with diabetes has to constantly try to balance insulin requirement with diet and exercise. It is therefore not surprising that hypoglycemia occurs so frequently. […] The lower the median blood glucose during the day, the greater the frequency of symptomatic and biochemical hypoglycemia […] Strict glycemic control can […] induce the acquired hypoglycemia syndromes, impaired awareness of hypoglycemia (a major risk factor for severe hypoglycemia), and counterregulatory hormonal deficiencies (which interfere with blood glucose recovery). […] Severe hypoglycemia is more common at the extremes of age – in very young children and in elderly people. […] In type 1 diabetes the frequency of severe hypoglycemia increases with duration of diabetes (12), while in type 2 diabetes it is associated with increasing duration of insulin treatment (18). […] Around one quarter of all episodes of severe hypoglycemia result in coma […] In 10% of episodes of severe hypoglycemia affecting people with type 1 diabetes and around 30% of those in people with insulin-treated type 2 diabetes, the assistance of the emergency medical services is required (23). However, most episodes (both mild and severe) are treated in the community, and few people require admission to hospital.”

“Severe hypoglycemia is potentially dangerous and has a significant mortality and morbidity, particularly in older people with insulin-treated diabetes who often have premature macrovascular disease. The hemodynamic effects of autonomic stimulation may provoke acute vascular events such as myocardial ischemia and infarction, cardiac failure, cerebral ischemia, and stroke (6). In clinical practice the cardiovascular and cerebrovascular consequences of hypoglycemia are frequently overlooked because the role of hypoglycemia in precipitating the vascular event is missed. […] The profuse secretion of catecholamines in response to hypoglycemia provokes a fall in plasma potassium and causes electrocardiographic (ECG) changes, which in some individuals may provoke a cardiac arrhythmia […]. A possible mechanism that has been observed with ECG recordings during hypoglycemia is prolongation of the QT interval […]. Hypoglycemia-induced arrhythmias during sleep have been implicated as the cause of the “dead in bed” syndrome that is recognized in young people with type 1 diabetes (40). […] Total cerebral blood flow is increased during acute hypoglycemia while regional blood flow within the brain is altered acutely. Blood flow increases in the frontal cortex, presumably as a protective compensatory mechanism to enhance the supply of available glucose to the most vulnerable part of the brain. These regional vascular changes become permanent in people who are exposed to recurrent severe hypoglycemia and in those with impaired awareness of hypoglycemia, and are then present during normoglycemia (41). This probably represents an adaptive response of the brain to recurrent exposure to neuroglycopenia. However, these permanent hypoglycemia-induced changes in regional cerebral blood flow may encourage localized neuronal ischemia, particularly if the cerebral circulation is already compromised by the development of cerebrovascular disease associated with diabetes. […] Hypoglycemia-induced EEG changes can persist for days or become permanent, particularly after recurrent severe hypoglycemia”.

“In the large British Diabetic Association Cohort Study of people who had developed type 1 diabetes before the age of 30, acute metabolic complications of diabetes were the greatest single cause of excess death under the age of 30; hypoglycemia was the cause of death in 18% of males and 6% of females in the 20–49 age group (47).”

“[The] syndromes of counterregulatory hormonal deficiencies and impaired awareness of hypoglycemia (IAH) develop over a period of years and ultimately affect a substantial proportion of people with type 1 diabetes and a lesser number with insulin-treated type 2 diabetes. They are considered to be components of hypoglycemia-associated autonomic failure (HAAF), through down-regulation of the central mechanisms within the brain that would normally activate glucoregulatory responses to hypoglycemia, including the release of counterregulatory hormones and the generation of warning symptoms (48). […] The glucagon secretory response to hypoglycemia becomes diminished or absent within a few years of the onset of insulin-deficient diabetes. With glucagon deficiency alone, blood glucose recovery from hypoglycemia is not noticeably affected because the secretion of epinephrine maintains counterregulation. However, almost half of those who have type 1 diabetes of 20 years duration have evidence of impairment of both glucagon and epinephrine in response to hypoglycemia (49); this seriously delays blood glucose recovery and allows progression to more severe and prolonged hypoglycemia when exposed to low blood glucose. People with type 1 diabetes who have these combined counterregulatory hormonal deficiencies have a 25-fold higher risk of experiencing severe hypoglycemia if they are subjected to intensive insulin therapy compared with those who have lost their glucagon response but have retained epinephrine secretion […] Impaired awareness is not an “all or none” phenomenon. “Partial” impairment of awareness may develop, with the individual being aware of some episodes of hypoglycemia but not others (53). Alternatively, the intensity or number of symptoms may be reduced, and neuroglycopenic symptoms predominate. […] total absence of any symptoms, albeit subtle, is very uncommon […] IAH affects 20–25% of patients with type 1 diabetes (11, 55) and less than 10% with type 2 diabetes (24), becomes more prevalent with increasing duration of diabetes (12) […], and predisposes the patient to a sixfold higher risk of severe hypoglycemia than people who retain normal awareness (56). When IAH is associated with strict glycemic control during intensive insulin therapy or has followed episodes of recurrent severe hypoglycemia, it may be reversible by relaxing glycemic control or by avoiding further hypoglycemia (11), but in many patients with type 1 diabetes of long duration, it appears to be a permanent defect. […] The modern management of diabetes strives to achieve strict glycemic control using intensive therapy to avoid or minimize the long-term complications of diabetes; this strategy tends to increase the risk of hypoglycemia and promotes development of the acquired hypoglycemia syndromes.”

February 5, 2017 | Books, Cardiology, Diabetes, Epidemiology, Medicine, Neurology, Psychology