Tuesday, March 31, 2015

Making The Most Out Of Food

 (Un)Conspicuous Consumption

In an article in The New Yorker, Hannah Goldfield writes about a culinary trend that makes the most of food, which I think is a good thing. When your parents told you not to waste food, they might have been more progressive than you then imagined, or at least they had a hard time seeing their hard-earned dollars go so easily to waste.

Goldfield writes in “Waste Not, Want Not, Eat Up?” about such a restaurant in New York City’s West Village:
The other night, as I ate a salad at Blue Hill, in the West Village, a server approached my table with an iPad. “Have you seen this?” she asked. “Chef wanted you to see this.” By “Chef,” she meant Dan Barber, the man behind Blue Hill and Blue Hill Stone Barns, a sister restaurant and farm upstate. By “this,” she meant a photograph of a dumpster, into which a chute was depositing an enormous quantity of multi-colored scraps of fruit and vegetables—the runoff from a commercial food processor. The experience felt something similar to being shown a picture of what would happen to a sad-eyed old horse if you didn’t save it from the glue factory. Sitting in a small, enamel casserole dish in front of me were fruit and vegetable scraps that Barber had rescued, just like the ones in the photo. Arranged in an artful tangle, bits of carrot, apple, and pear were dressed with a creamy green emulsion, studded with pistachios, and garnished with a foamy pouf that turned out to be the liquid from canned chickpeas, whipped into haute cuisine.
You can both enjoy food and reduce waste; this is what we have been doing in our house for years, chiefly out of economic necessity, but also as a philosophy of life: not to waste. (I can picture the heads of both my mother and my father nodding in approval.) There is a quiet satisfaction in knowing that you live in this manner, and that your children approve and try to do the same. So much for conspicuous consumption.
For more, go to [NewYorker]

Monday, March 30, 2015

Science Is Often Speculative

Human Thought

I am a firm believer, that without speculation there is no good and original observation.
Charles Darwin
Letter to A. R. Wallace (22 Dec 1857).
In Alfred Russel Wallace and Sir James Marchant (ed.),  
Alfred Russel Wallace: Letters and Reminiscences (1916), 109

[Science] is not perfect. It can be misused. It is only a tool. But it is by far the best tool we have, self-correcting, ongoing, applicable to everything. It has two rules. First: there are no sacred truths; all assumptions must be critically examined; arguments from authority are worthless. Second: whatever is inconsistent with the facts must be discarded or revised. ... The obvious is sometimes false; the unexpected is sometimes true.
Carl Sagan
Cosmos (1985): 277

Carl Sagan [1934–1996] at the Very Large Array in New Mexico. Sagan said in a 1996 interview for NOVA that claims of extraterrestrial life must go beyond speculation to rigorous proof:
“I personally have been captured by the notion of extraterrestrial life, and especially extraterrestrial intelligence, from childhood. It swept me up, and I've been involved in sending space craft to nearby planets to look for life and in the radio search for extraterrestrial intelligence.    It would be an absolutely transforming event in human history. But, the stakes are so high on whether it's true or false that we must demand the more rigorous standards of evidence—precisely because it's so exciting. That's the circumstance in which our hopes may dominate our skeptical scrutiny of the data. So, we have to be very careful. There have been a few instances in the [past]. We thought we found something, and it always turned out to be explicable.”
Photo Credit: Cosmos/Discovery
Source: Space.com

The title is correct; science often makes speculative theories about what it does not yet know or fully understand. Speculation is the means of bringing forth new ideas to advance our thinking; speculation is the fount of progress; speculation is the oxygen of human advancement and achievement.

But this is only the beginning of the hard work; it can take decades (or longer) for an idea or developed theory to gain acceptance. Such “speculation” is one way to bring about the testing of a hypothesis and the debating and discussing of experimental results. It is also a way to advance an idea, inchoate as it might be, making it understandable first to the community of scientists and then, if found valid and true, to the general community. Speculative theories have to be tested in accordance with the standards of science and its scientific method. Speculation alone is insufficient; it must be tested, and tested rigorously and without bias.

Speculation is not bad; it is actually good and necessary. This is how humans progress. Speculate. Test. Reason. Debate. Retest. Confirm. Good and original ideas are the best, but they are always in short supply, and they are always the ones that draw the most arguments against them. This is not to suggest that all original ideas are good. Most are likely not. But it is probably true that in the annals of science, all ideas that are now viewed as great and wonderful were original, and at first viewed with suspicion and scorn by the scientific community. That is, they were not accepted easily in their time.

Galileo Galilei, the 17th-century Italian physicist and astronomer, is considered the Father of Modern Science. As Galileo points out: “All truths are easy to understand once they are discovered; the point is to discover them.”

There are many notable examples, including: 1) when Nicolaus Copernicus of Poland, in 1514, circulated Commentariolus (Latin for “Small Commentary”), in which he described a heliocentric planetary system, it was not immediately accepted by scientists and, moreover, its findings incensed Church authorities; 2) when Galileo Galilei of Italy, in 1632, published the Dialogue Concerning the Two Chief World Systems in support of the Copernican theory, he was convicted of heresy by the Catholic Church and spent the remaining years of his life under house arrest; and 3) when Ignaz Semmelweis of Hungary, in 1847, came up with a model of infection control, chiefly through hand washing, physicians mocked him and his ideas; his book, Etiology, Concept and Prophylaxis of Childbed Fever, published in 1861, was dismissed as irrelevant by many leading scientists of the time, notably Rudolf Virchow, then considered the leading authority. A few years later, Semmelweis died in ignominy.

Some speculative ideas receive little objection. When Charles Darwin of England published On the Origin of Species in 1859, his ideas were not vociferously attacked, as one would expect, except primarily by the religious community, which remains the situation today. By 1870, natural selection and evolution were considered true and valid by most of the scientific community and much of the general public. And, of course, when Albert Einstein of Germany published his paper on the special theory of relativity in 1905, it became a serious topic of discussion, at first within Germany and then elsewhere. By 1919, it was widely accepted by the scientific community, only 14 years after the initial paper.

For an interesting article on why Einstein’s theory was readily adopted, see “Why was Relativity Accepted?” by Stephen G. Brush, in Physics in Perspective 1 (1999): 184–214. One of the convincing arguments is that it takes someone in a position of authority to first accept the theory and act as its advocate, thus convincing and, perhaps, compelling other scientists to seriously consider its validity. In Einstein’s case, Max Planck and Arthur Eddington were early supporters of the theory.

Brush writes:
Why was relativity accepted? The historical studies reviewed in this paper can be put together to suggest a three-stage answer. In the first stage, a few leading scientists such as Planck and Eddington adopted the theory because it promised to satisfy their desire for a coherent, mathematically sophisticated, fundamental picture of the universe. In the second stage, their enthusiastic advocacy persuaded other scientists to work on the theory and apply it to problems that were currently of great interest: the behavior of electrons, and Bohr’s atomic model. The special theory was accepted by many German physicists by 1910 and had begun to attract some interest in other countries.

In the third stage, the confirmation of Einstein’s light-bending prediction attracted so much attention among the general public as well as among scientists that no one could ignore it after 1919. Physicists who had not previously accepted relativity now had to take it seriously, and when they did, they were persuaded of its validity by a combination of factors.
This might apply to all ground-breaking theories. These are only a handful of examples of scientists’ speculation changing the way we think about and view the world. Such is the way it is with us humans. We do not easily accept change, even when ideas are proven true, but we will open our minds to the possibility if within the theory there is something already agreeable. Science on the whole is skeptical and cautious, which are good qualities.

Then there are the far-out theories, such as the idea of extraterrestrial life. Sagan writes in chapter 17 (“The Marriage of Skepticism and Wonder”) of The Demon-Haunted World: Science as a Candle in the Dark (1995) about the human and scientific need to balance openness and wonder with skepticism and caution:
 At the heart of science is an essential balance between two seemingly contradictory attitudes - an openness to new ideas, no matter how bizarre or counterintuitive, and the most ruthlessly skeptical scrutiny of all ideas, old and new. This is how deep truths are winnowed from deep nonsense. The collective enterprise of creative thinking and skeptical thinking, working together, keeps the field on track. Those two seemingly contradictory attitudes are, though, in some tension.
It is through the working out of this tension that great discoveries are made.

Saturday, March 28, 2015

The Rare Gobi Bears Of Mongolia

Species Survival

A female Gobi bear, Chadwick of National Geographic says, “warily eyes the scientists who minutes before immobilized her, checked her physical condition, and attached a GPS radio collar and ear tag—all in hopes of improving her chances of survival.”
Photo Credit: Joe Riis
Source: National Geographic

An article, by Douglas Chadwick, in National Geographic looks at the Gobi bear (Ursus arctos gobiensis), known in Mongolian as the mazaalai, which is somehow surviving in one of the harshest climates in the world: the Gobi Desert in Mongolia.

Chadwick, a wildlife biologist, writes:
The Gobi is Earth’s fifth largest desert, sprawling across half a million square miles of southern Mongolia and northern China. It sees temperatures of minus 40°F in winter and 120 in summer, and gets just two to eight inches of annual rainfall. Some years parts of the region receive no rain at all. Windstorms sweep through day and night, with gusts strong enough to send a tent sailing away over the horizon. When winds are calm, the Gobi’s immense silence can feel as overwhelming as the heat.

Signs of life come as a surprise in this sun-blasted, wind-scoured landscape. Peering through binoculars, I at first see just barren rock rising in ranks of mountains. The only things that move are dust devils and the shimmering heat.

The Gobi’s stark landscape appears devoid of life, but its wildlife community is surprisingly rich. Slowly, as I discover where to look, animal forms emerge: A lizard rests in the thin shade of a saxaul shrub. A saker falcon lifts off from a distant cliffside. Gerbils poke their heads from burrows.

But many days pass before I finally lay eyes on the animal I crossed half a world to see: a Gobi bear, among the rarest and least known large mammals on Earth. There are perhaps no more than two or three dozen left in the wild, and none live in captivity anywhere.

This male stops at an oasis to sip water, then rests nearby. Elated by our good luck and mesmerized by the sight, my companions and I watch the bear for two hours, from late afternoon to nightfall. Most bears become active toward day’s end, but this one remains oddly still. When he finally attempts to walk, his gait seems pained and slow. He must have traveled a great distance to reach water, I tell myself, and the journey might have left him exhausted and temporarily lame.

In reality, the bear is dying. A week later a ranger finds his body near the same oasis. The old male had likely emerged from hibernation in poor condition at a time when food plants were just starting to grow.
The Gobi bear, a subspecies of the brown bear, is critically endangered. There are fewer than three dozen Gobi bears left, making them among the world’s rarest animals. Yet the bears survive, eating what they must to do so. This is a story of survival, aided to a large degree by Mongolia’s laws protecting the bear; the article notes: “One positive legacy of the Soviet era is the Great Gobi Strictly Protected Area (GGSPA), a sprawling nature preserve established in 1976 and declared a Biosphere Reserve by UNESCO in 1990. Today the reserve is the Gobi bear’s sole refuge. Access is allowed only by permission.” For now, this is necessary, allowing nature to take its course. Nature, as we observe, is unsentimental; it gives and it takes without emotion.

For more, go to [NatGeo]

Tuesday, March 24, 2015

The Cancer Documentary

Cancer Research

Ken Burns is a master of the documentary; I am looking forward to watching this one. It is based on the book of the same title; I have yet to read the book, although it is on my list of books to purchase. The documentary runs for three nights, between March 30 and April 1, six hours in total, on PBS.

The film’s producers describe the documentary as follows:
Cancer: The Emperor of All Maladies is a three-part, six-hour major television event on PBS presented by documentary filmmaker Ken Burns, in partnership with WETA, the flagship public broadcasting station in Washington, D.C. Based on the 2010 Pulitzer Prize-winning book The Emperor of All Maladies: A Biography of Cancer by Siddhartha Mukherjee, the series is the most comprehensive documentary on a single disease ever made. This “biography” of cancer covers its first documented appearances thousands of years ago through the epic battles in the 20th century to cure, control and conquer it, to a radical new understanding of its essence. The series also features the current status of cancer knowledge and treatment—the dawn of an era in which cancer may become a chronic or curable illness rather than its historic death sentence in some forms.
Having read and kept current on the latest research on cancer, and having posted some of the research currently underway, I can say this statement is true and accurate. We have entered the golden age of cancer research and treatment, but we must continue to advance, since the disease is horrible and relentless.

For more, go to [CancerFilms]

Monday, March 23, 2015

America's Policies On Iran Helped Elect Netanyahu

The Iran File

The Israeli elections are over; it appears that Benjamin Netanyahu will cobble together a right-leaning religious coalition to once again lead Israel as its prime minister—his fourth term. It is all about security. The issue of security once again triumphed over the socio-economic issues of housing, employment and poverty (so-called kitchen-table concerns). Two nations, one close and one farther away, played a part in this election; one is an ally, and one is not. The people of Israel naturally want peace and security, and it might be that they view Netanyahu as their rightful and only protector. It also might be that America helped elect Netanyahu over the issue of Iran’s nuclear ambitions, which is the argument Prof. George Jochnowitz puts forth: “Obama made some Israelis decide that a tough stand was required. The Israelis who decided at the last minute to vote for Netanyahu did so because of Obama’s lack of flexibility.”

by George Jochnowitz

In February, House Speaker John Boehner said he had invited Pope Francis to address a joint meeting of Congress. The Pope is scheduled to appear on September 24.

This invitation is a gross breach of protocol. Nevertheless, nobody much noticed. There have been no condemnations of this breach of etiquette. In all likelihood, President Obama will meet with the Pope when he is in the United States.

The contrast with the reaction to Boehner’s invitation to Israel’s Prime Minister Netanyahu is striking. Vice President Joe Biden did not attend Netanyahu’s speech to Congress, even though the Vice President is also President of the Senate. Eight Senators and 50 House Representatives also did not attend.

All of them would have attended had President Obama agreed to meet, however briefly, with Netanyahu. It would have been the polite thing to do, even though the invitation itself was a breach of protocol. Netanyahu and Obama could have exchanged any new information they might have had about Iran’s nuclear plans, even though they probably couldn’t have changed any views.

Instead, the reaction everywhere was that the breach of protocol was an absolute outrage. One of the factors leading to this view was that Netanyahu was running for reelection, and that he was going to Congress to help get votes for his party. It is indeed possible that one of the reasons Netanyahu accepted the invitation was that he felt it might help him win. I can’t read his mind—or anyone else’s—but I believe that the big issue motivating him was his fear of Iran’s increasing nuclear capabilities.

Obama certainly must know that in 2001, Iran’s moderate President Rafsanjani said a nuclear attack against Israel “will leave nothing on the ground, whereas it will only damage the world of Islam.”

Israel has every reason to fear Iran’s nuclear ambitions. That is what Netanyahu spoke about in his address to Congress. President Obama should understand—and share—Netanyahu’s fear. In his speech, Netanyahu said, “We appreciate all that President Obama has done for Israel.”

Nowhere did Netanyahu say anything that might have offended Obama. Obama, unfortunately, had already been offended, and Netanyahu’s words couldn’t change that. Obama did not offer any conciliatory gesture after the speech. Netanyahu went back to Israel knowing that the President of Israel’s chief ally in the world was rigidly opposed to both Netanyahu and to his desire to control Iran’s attempts at getting the bomb.

Iran is an oil-rich country. Does it make sense for Iran to have provoked many countries to impose sanctions against it? Canada, Australia, England, France and Japan have joined in imposing restrictions against trade with Iran.

Iran doesn’t care. Destroying Israel is a more important goal. President Obama shows no understanding of the problem. That freed Netanyahu to make a statement saying he was changing his openness to creating a Palestinian state.

Israeli politics is very complicated. It became simpler when President Obama’s rigidity convinced many Israelis that nothing could be gained by working with the United States to bring about a peaceful withdrawal from disputed territories. Obama made some Israelis decide that a tough stand was required. The Israelis who decided at the last minute to vote for Netanyahu did so because of Obama’s lack of flexibility.

George Jochnowitz was born in New York City, in 1937. He became aware of different regional pronunciations when he was six, and he could consciously switch accents as a child. He got his Ph.D. in linguistics from Columbia University and taught linguistics at the College of Staten Island, CUNY. His area of specialization was Jewish languages, in particular, Judeo-Italian dialects. As part of a faculty-exchange agreement with Hebei University in Baoding, China, he was in China during the Tiananmen Massacre. He can be reached at george@jochnowitz.net.

Copyright ©2015. George Jochnowitz. All Rights Reserved. A version of this article originally appeared in the algemeiner (March 19, 2015). The article is republished here with the author’s permission.

Sunday, March 22, 2015

The Age Of The Automatons

Intelligent Machines

“Artwork for the cover of a 1959 issue of the French science fiction magazine Galaxie.”
Photo Credit: CCI/Art Archive/Art Resource

In a book review article (“How Robots & Algorithms Are Taking Over,” April 2, 2015) in The New York Review of Books, Sue Halpern revisits the idea that automatons (or robots) are not only displacing workers—they are—but also that they pose a threat to humanity—they might—if, left unchecked, they start developing intelligence greater than ours. This idea resurfaces every generation or so, notably when the economy tanks, as it did in 2008.

In citing Nicholas Carr’s The Glass Cage: Automation and Us, Halpern writes that job losses are almost certain to take place, notably in fields where intelligent machines can perform the tasks better, that is, with greater speed and fewer mistakes.
In September 2013, about a year before Nicholas Carr published The Glass Cage: Automation and Us, his chastening meditation on the human future, a pair of Oxford researchers issued a report predicting that nearly half of all jobs in the United States could be lost to machines within the next twenty years. The researchers, Carl Benedikt Frey and Michael Osborne, looked at seven hundred kinds of work and found that of those occupations, among the most susceptible to automation were loan officers, receptionists, paralegals, store clerks, taxi drivers, and security guards. Even computer programmers, the people writing the algorithms that are taking on these tasks, will not be immune. By Frey and Osborne’s calculations, there is about a 50 percent chance that programming, too, will be outsourced to machines within the next two decades.
In fact, this is already happening, in part because programmers increasingly rely on “self-correcting” code—that is, code that debugs and rewrites itself—and in part because they are creating machines that are able to learn on the job. While these machines cannot think, per se, they can process phenomenal amounts of data with ever-increasing speed and use what they have learned to perform such functions as medical diagnosis, navigation, and translation, among many others. Add to these self-repairing robots that are able to negotiate hostile environments like radioactive power plants and collapsed mines and then fix themselves without human intercession when the need arises. The most recent iteration of these robots has been designed by the robots themselves, suggesting that in the future even roboticists may find themselves out of work.
Another concern is that automation, including the human use of Google search engines, dulls the brain, an effect that is likely real at least as it applies to how we think about finding information. Another way of looking at it is that if information retrieval is now easier and faster, our brains can be used for other, perhaps more important, matters. This is a positive change.

Yet, doubt about our automated future persists, and has for some time. The book to read is Norbert Wiener's The Human Use of Human Beings; although it was published in 1950, it is still relevant today [see my post here]. The fear of unbounded and amoral technology has a long history in literature; Frankenstein's monster is itself a modern rendering of the myth of Prometheus.

If even half of this takes place within the next 20 years, what we have before us is a bleak, dystopian future for humanity, where many individuals will not only be unemployed, but where machines will make humans almost superfluous, unnecessary in what many predict will be a fully formed and functioning machine age. This is the kind of thinking that has supplied and supported many a science-fiction novel. There are variations on this theme, including machines revolting against their human masters and makers, humans revolting against their machine overlords, humans uniting with a few courageous machines to bring freedom to the world, and machines dominating humans as we have dominated our planet.

It is true that machines will replace humans in jobs that they can do better; it is the nature of technology to do this, especially when allied with commercial interests seeking profit; what will happen to so many displaced workers is hard to predict now. It is possible that there will be new industries dedicated to the robotic age, the age of automatons. It is disheartening to many to see machines, no matter how intelligent, become more intelligent than us. These are valid concerns.

On the flip side, there are social robots, the article notes:
What is a social robot? In the words of John Markoff of The New York Times, “it’s a robot with a little humanity.” It will tell your child bedtime stories, order takeout when you don’t feel like cooking, know you prefer Coke over Pepsi, and snap photos of important life events so you don’t have to step out of the picture.
When I mentioned this scenario to my 13-year-old son, his reaction was positive, even excited. “Great, I can sit in front of my screen at work and the robot will order me pizza and Coke.” He believes that he will have a job. I think he will, but it might be in a completely different field that we have yet to envision. The stuff of sci-fi? Perhaps.

A more important question is why we humans see machines as a threat to our autonomy. If we do see intelligent machines as a threat, can it be that we are imposing our views of the world on the machine, in effect giving human qualities to machines? I hope not, because I want to believe that intelligent, rational “beings” will progress and learn from all the mistakes that humans have made. Is it possible that we can learn from such intelligent beings? I think so. I do not see machines as a threat, no matter how intelligent they become, but as a benefit to us.

For more, go to [NYRB]

Saturday, March 21, 2015

Alzheimer's Disease Research Focuses on Microglial Cells

The Human Brain

Alzheimer’s Disease: “Publishing in Science Translational Medicine, the team describes the technique as using a particular type of ultrasound called a focused therapeutic ultrasound, which non-invasively beams sound waves into the brain tissue. By oscillating super-fast, these sound waves are able to gently open up the blood-brain barrier, which is a layer that protects the brain against bacteria, and stimulate the brain’s microglial cells to move in. Microglial cells are basically waste-removal cells, so once they get past the blood-brain barrier, they’re able to clear out the toxic beta-amyloid clumps before the blood-brain barrier is restored within a few hours.”

Photo Credit: Image: 3Dme Creative Studio / Shutterstock.com
Source: ScienceAlert

An article in ScienceAlert looks at one of the many research studies currently underway to reverse the damage caused by Alzheimer’s disease, a degenerative disease of the brain that affects 50 million individuals worldwide. In Canada, about 750,000 individuals have been diagnosed with a cognitive impairment; in the United States, about 5 million.

As the population ages, the numbers are expected to rise significantly within the next two or three decades, possibly doubling or even tripling. This explains the focused scientific research to alleviate or reverse its damage to human cognition.

In the study, a team from the University of Queensland in Australia used ultrasound waves—a non-invasive method—to help improve memory by clearing up the plaques that build up in the brain. So far, the study has been conducted on mice, and 75 percent of the treated mice got their memories back.

In “New Alzheimer’s treatment fully restores memory function,” (March 18, 2015), the article says:
Australian researchers have come up with a non-invasive ultrasound technology that clears the brain of neurotoxic amyloid plaques - structures that are responsible for memory loss and a decline in cognitive function in Alzheimer’s patients.

If a person has Alzheimer’s disease, it’s usually the result of a build-up of two types of lesions - amyloid plaques, and neurofibrillary tangles. Amyloid plaques sit between the neurons and end up as dense clusters of beta-amyloid molecules, a sticky type of protein that clumps together and forms plaques.

Neurofibrillary tangles are found inside the neurons of the brain, and they’re caused by defective tau proteins that clump up into a thick, insoluble mass. This causes tiny filaments called microtubules to get all twisted, which disrupts the transportation of essential materials such as nutrients and organelles along them, just like when you twist up the vacuum cleaner tube.

As we don’t have any kind of vaccine or preventative measure for Alzheimer’s - a disease that affects 343,000 people in Australia, and 50 million worldwide - it’s been a race to figure out how best to treat it, starting with how to clear the build-up of defective beta-amyloid and tau proteins from a patient’s brain.
The key to solving the Alzheimer’s puzzle rests on allowing the brain’s waste-removal team, so to speak, to do its job. This team consists of microglial cells, which Wikipedia says are “the resident macrophages of the brain and spinal cord, and thus act as the first and main form of active immune defense in the central nervous system (CNS).”

So far this is good news for the mice; human trials are expected to begin in 2017. In another study looking at microglial cells, this one at Stanford University’s School of Medicine in the U.S., researchers focused on blocking a receptor to improve the cells’ ability to do their job. The article, by Bruce Goldman, is posted on Stanford Medical News.

In “Blocking receptor in brain’s immune cells counters Alzheimer’s in mice, study finds” (December 8, 2014), Goldman writes:

The researchers found that, in mice, blocking the action of a single molecule on the surface of microglia restored the cells’ ability to get the job done — and reversed memory loss and myriad other Alzheimer’s-like features in the animals.
The study, published online Dec. 8 in The Journal of Clinical Investigation, illustrates the importance of microglia and could lead to new ways of warding off the onset of Alzheimer’s disease, which is predicted to afflict 15 million people [in the U.S. ] by mid-century unless some form of cure or prevention is found. The study also may help explain an intriguing association between aspirin and reduced rates of Alzheimer’s.
Microglia, which constitute about 10 to 15 percent of all the cells in the brain, actually resemble immune cells considerably more than they do nerve cells. “Microglia are the brain’s beat cops,” said Katrin Andreasson, MD, professor of neurology and neurological sciences and the study’s senior author. “Our experiments show that keeping them on the right track counters memory loss and preserves healthy brain physiology.”
A microglial cell serves as a front-line sentry, monitoring its surroundings for suspicious activities and materials by probing its local environment. If it spots trouble, it releases substances that recruit other microglia to the scene, said Andreasson. Microglia are tough cops, protecting the brain against invading bacteria and viruses by gobbling them up. They are adept at calming things down, too, clamping down on inflammation if it gets out of hand. They also work as garbage collectors, chewing up dead cells and molecular debris strewn among living cells — including clusters of a protein called A-beta, notorious for aggregating into gummy deposits called Alzheimer’s plaques, the disease’s hallmark anatomical feature.
Perhaps, in the next few years, one or both methods will help reverse the terrible effects of Alzheimer’s disease. The key seems to lie in the microglial cell, or at least this is the current thinking and research focus. Human cognition, our memories, our thought processes and our ability to make sense of our surroundings, is very much a part of us and defines who we are; without it, we feel lost, incomplete, angry. We wish the researchers good luck in their endeavors; this work is important, life-altering.

For more, go to [ScienceAlert] and  [Stanford]

Friday, March 20, 2015

Stravinsky's Rite of Spring

In this 2013 BBC Proms performance, the orchestra Les Siècles performs Stravinsky’s Rite of Spring (“Le sacre du printemps”), with François-Xavier Roth at the podium. The French orchestra was formed in the summer of 2003 by Roth.

As its website says:
With a vast period-instrument collection at its disposal, spanning the baroque, classical, romantic and modern eras, the orchestra’s repertoire is notably wide in range. Les Siècles is one of a small number of ensembles to employ period and modern instruments, playing each repertoire on appropriate instruments. Its flexible and historically informed work delivers a unique strand of creative programming.
The piece premiered at Paris’s Théâtre des Champs-Élysées on May 29, 1913; it caused quite a lot of controversy, Wikipedia notes:
It was written for the 1913 Paris season of Sergei Diaghilev's Ballets Russes company; the original choreography was by Vaslav Nijinsky, with stage designs and costumes by Nicholas Roerich. When first performed, at the Théâtre des Champs-Élysées on 29 May 1913, the avant-garde nature of the music and choreography caused a sensation and a near-riot in the audience. Although designed as a work for the stage, with specific passages accompanying characters and action, the music achieved equal if not greater recognition as a concert piece, and is widely considered to be one of the most influential musical works of the 20th century.
Spring officially begins in the northern hemisphere with the March, or vernal, equinox at 22:45 UTC, which here in Toronto is 18:45, or 6:45 p.m. EDT. An enjoyable, meaningful and green spring to one and all.

Wednesday, March 18, 2015

The Creative Life: Often Unpleasant, But Meaningful

Human Meaning

Creativity & Emotions: Creative individuals are free to display the full range of emotions, Scott Barry Kaufman writes, and are not restricted by societal norms of what is appropriate or acceptable: “But perhaps most tellingly, the researchers found that creativity was more strongly related to the sum of positive and negative emotions than measures of positive or negative emotions alone. This suggests that the capacity to experience intense emotions– both positive and negative– may be central to the Creative Life.”
Photo Credit & Source: Scientific American

An article, by Scott Barry Kaufman, in Scientific American explores the relationship between creativity and well-being, suggesting that the creative life is not necessarily happy, but it is meaningful and leads to greater well-being than a life devoted to the pursuit of pleasure and happiness.

Kaufman, scientific director of the Imagination Institute in the Positive Psychology Center at the University of Pennsylvania, writes about some of the expectations common to a life devoted to creative expression:
While the Creative Life is not directly associated with traditional conceptualizations of happiness, the Creative Life appears to be associated with a more deeply meaningful life. In his book “Authentic Happiness“, Martin Seligman distinguishes between the “Pleasant Life” and the “Meaningful Life”. The Pleasant Life is what people tend to think of when they think of happiness: a life full of positive emotions and joy, and lacking challenge or struggle. The Pleasant Life is mainly about getting what you want and need. It is associated with feeling good in the moment, and being a taker more than a giver. In contrast, the Meaningful Life is linked to self-expression, and doing positive things for others. Certainly, there are factors that contribute to both the Pleasant Life and the Meaningful Life– including feeling connected to others, feeling productive, and not being alone or bored– but there are also some key differences between living a pleasant and meaningful life.

The Meaningful Life is associated with increased stress and anxiety, but it is also linked to greater integration of the past, present, and future, resiliency, and the
ability to cope with life’s inevitable difficulties. After all, as the Buddhists have long noted, every life has its 10,000 joys and 10,000 sorrows. “Humans may resemble many other creatures in their striving for happiness, but the quest for meaning is a key part of what makes us human, and uniquely so,” notes lead author Roy Baumeister.

The deep connection between creativity and meaning was noted long ago by the great creativity researcher Frank X. Barron. Through his pioneering research on some of the most creative people of his generation,
Barron came to realize that creative people have the remarkable capacity to become intimate with themselves. According to psychologist Ruth Richards, they “dare to look within, even at one’s irrational and less conscious material, including one’s ‘shadow’ materials”. Richards refers to this capacity as “courageous openness”.
This used to be referred to as having a well-rounded character, and reading literature gives a greater understanding of such an idea: an indication of a life well-read and well-lived. And, I would like to add, self-understood. A deep understanding of the self is not at all the same as a superficial view of one’s wants and needs; the latter is narcissistic, while the former is giving. It is also true that the majority of creative persons do not view the accumulation of wealth as their primary goal or purpose—not that money is unimportant; it is—but it is not as important as the other things that dominate their thoughts. In short, a beautiful mind.

These individuals, rare as they are, can be found in all areas of life: not only in the arts and humanities, where they are more common, but also in the sciences, in music, in religion, in teaching and in business. Sometimes they go unrecognized, but they continue their creative pursuits for the intrinsic pleasure these provide.

Creative persons, in search of the older truths of beauty, meaning and justice, do not have the single-minded determination to accumulate wealth that marks those who successfully achieve such purposes and priorities. Such individuals are happy, as many studies show, but it is a situational happiness and not one emanating from an integrated self. Moreover, the pursuers of wealth are unlikely to be burdened by the thoughts of meaning that penetrate and circulate within the minds of creative individuals. As for creative individuals who pursue power, the historical record shows few such persons.

For more, go to [ScienAmer].

Monday, March 16, 2015

Reading Literature Any Old Way You Like

Academic Research

A Few Of My Books. This represents one of my many bookshelves that line the walls of our two-bedroom apartment; books of science co-exist harmoniously with books of literature, from Dickens to Dawkins, from Pushkin to Pinker—I would think and expect that many would agree that knowledge from both arts and science is worth reading and considering, and, perhaps, worth enjoying.
Photo Credit & Source: (c) Perry J. Greenbaum, 2015

A number of years ago, a friend remarked that he no longer had any love of reading literature, having lost the desire after completing a graduate degree in English literature. I was then considering a graduate degree in English literature myself, and found his response disheartening. I did not pursue the idea, and I am glad that I did not; it is likely that I never will, given how literature departments now view and teach literature.

Not only literature, it must be added: the whole academic branch of the humanities has, for the last several decades, viewed texts, music and other modes of communication as a rich source to mine and interpret in accordance with a particular political and socio-economic view of the world. When one talks about the humanities, one is generally referring to the study of how humans interact with and influence culture, whether through language, religion, writing, painting, history, philosophy or music.

In an age where relevance and practicality reign supreme, when young minds want to better their chances at a good, well-paying career, and when many PhDs in English cannot find a tenured position, it would seem ill-advised to sign up for a degree in literature, philosophy or the creative arts. That humanities departments feel under attack is not new; such has been the case for more than a century, since the late 1880s, when science gained importance and prominence with each new discovery and advancement that bettered the human condition and generated excitement.

In From Dawn to Decadence: 500 Years of Western Cultural Life (2000), Jacques Barzun, a professor of history at Columbia University, writes:
It would be wrong to suppose that the scientists went out of their way to maim or kill the humanists. The latter's wounds were self-inflicted. In the hope of rivaling science, of becoming sciences, the humanities gave up their birthright. By teaching college students the methods of minute scholarship, they denatured the contents and obscured the virtues of liberal studies.
“Research” was the deceptive word that made humanists devote their efforts exclusively to digging out facts about their subjects without ever getting back into it. (606–7)
This “research” is influenced by literary theories. In the reading of literature today there are numerous literary theories; the three major branches are Marxism, Feminism and Postmodernism, each school having sub-branches like post-colonial studies, gender studies and cultural studies. The working argument is that such theories are necessary to understand a work of literature. Thus, theorists devoted to a particular school of literary thought apply it and debate with passion and zeal, as if literature ought to be viewed like a science but debated like a religion or ideology. I can understand why such schools of thought were organized, but I cannot necessarily see their importance.

Take Flaubert’s Madame Bovary, a book that I read some years ago and enjoyed. Applying Feminist theory to its protagonist will give a different understanding of Emma Bovary’s actions than applying Marxist theory. And if you bring in gender and cultural studies, her actions, considered scandalous and morally wrong by respectable persons at the time the book was published in 1856, would today be viewed by some as necessary and courageous, given her “imprisonment” as a wife of a provincial doctor and her need for excitement as a counterweight to the boredom of life.

Think of what can be done to another of my favourites, Dostoevsky’s Crime and Punishment, published in 1866. Instead of viewing Raskolnikov as the cold-blooded murderer that he was, one who applies, for example, Marxist theory can easily be sympathetic to Raskolnikov as an impoverished student who needed money. That he murdered two persons, Alyona Ivanovna (“a Jew” and “an awful old harpy”) and her half-sister by a different mother, Lizaveta Ivanovna (“a good-natured face and eyes”), becomes less important than the idea that his “need for money” is itself a crime dictated by social and economic inequalities.

Are murder, theft and adultery wrong? I think so; these are not victimless crimes. It is easy to see how applying current literary theories to older, classical works can be problematic: the ideas emanating from such close readings and analyses can lead to a far different understanding of what the author originally intended and of what society then considered important.

But we have gone deeper down the rabbit hole. Literary critics often say it is irrelevant what the author originally thought or wrote, a question of what is called authorial intent; the work is a living document that can be read in light of modern ideas and theories, and individual readers can and ought to form their own opinions about a particular work. This idea was put forth by Roland Barthes, a French literary critic, in his 1967 essay “The Death of the Author.” He and Michel Foucault were influential in how literature came to be studied, leading to the formation of what is called the poststructuralist movement.

Many others followed, leading to incomprehensible and nonsensical language; the aim, it seems, is not to communicate. The book to read is Fashionable Nonsense: Postmodern Intellectuals’ Abuse of Science (1997) by Alan Sokal and Jean Bricmont, both of whom are physicists.

With so many theories—many political and without reason—applied to the “close reading” of literature, what can initially start off as a love affair can quickly sour. Literature and the reading of it is no longer done for enjoyment or to educate the mind in a broad way, but to turn the mind to a narrow political bent that matches the professor’s. Small wonder, then, that the humanities in general are suffering and considered irrelevant by scientists. The blame lies with the people who have led such programs into a sea of disconnect with today’s modern civilization.

In a much-discussed and much-debated article (“Science Is Not Your Enemy”; August 6, 2013) in The New Republic, Steven Pinker writes:
Diagnoses of the malaise of the humanities rightly point to anti-intellectual trends in our culture and to the commercialization of our universities. But an honest appraisal would have to acknowledge that some of the damage is self-inflicted. The humanities have yet to recover from the disaster of postmodernism, with its defiant obscurantism, dogmatic relativism, and suffocating political correctness. And they have failed to define a progressive agenda. Several university presidents and provosts have lamented to me that when a scientist comes into their office, it’s to announce some exciting new research opportunity and demand the resources to pursue it. When a humanities scholar drops by, it’s to plead for respect for the way things have always been done.
The humanities as a group have tried to become, in the last fifty years, what they can never become: a serious scientific discipline, all without taking any notice of the facts of science and the role that science, in particular applied science, plays in today’s modern society. So the field has entered and remained in the realm of the absurd and nonsensical, not to be taken seriously. It has failed, and miserably so, I must add with sadness. Not only the failure, but also the waste of time and resources and the indoctrination of students, who become professors and repeat the same mistakes in thinking, or in not thinking clearly. That many adopt Leftist thinking as their political view is not surprising; it is as if they have no choice if they want to be taken seriously by their literary peers in the academy.

It’s disheartening to see such a sight from otherwise fairly intelligent people, who often lack an understanding of science and the scientific method. Literature ought to be enjoyable, but it can only be so when free from the weight and burden of unnecessary restrictions; humanities departments have become what they should never have become: oppressive, restrictive places of inquiry. They have been ruled by what Gary Saul Morson, in a New Criterion piece, calls “the tyranny of theory.”

I agree that reading and understanding literature can act as a moral guide, and that a classical liberal-arts education can be a balance to a society devoted to science and technology. As much as I respect science and see its need, society needs the humanities. I am not so sure, however, that today’s literary theories help this cause. It seems that, for the most part, they only add to the chaos and confusion. It is my informed view that literature has an important place in modern society, notably in one influenced by and devoted to technology. The best literature, including Russian masterpieces like Anna Karenina, Crime and Punishment and Fathers and Sons, acts as a set of thought pieces on the moral human condition. That is, it provides important insights into the universal rules that guide us as moral human beings.

There might be some common ground, in that both science and the humanities, in the best of cases, are searching for some truth. Pinker, in the same article noted above, offers a suggestion that could save the humanities from themselves, given the direction they have taken: join forces and work together with science—a consilience of knowledge—instead of considering science the enemy.
Those ways do deserve respect, and there can be no replacement for the varieties of close reading, thick description, and deep immersion that erudite scholars can apply to individual works. But must these be the only paths to understanding? A consilience with science offers the humanities countless possibilities for innovation in understanding. Art, culture, and society are products of human brains. They originate in our faculties of perception, thought, and emotion, and they cumulate and spread through the epidemiological dynamics by which one person affects others. Shouldn’t we be curious to understand these connections? Both sides would win. The humanities would enjoy more of the explanatory depth of the sciences, to say nothing of the kind of a progressive agenda that appeals to deans and donors. The sciences could challenge their theories with the natural experiments and ecologically valid phenomena that have been so richly characterized by humanists.
I do not completely agree with Pinker, since I consider the classical, old-fashioned view of the humanities sufficient in its own right: the one that said reading ought to be enjoyable, that the text had something important to say back then, and that we ought to understand original authorial intent. This does not suggest that we ought to agree with it, but we ought to at least understand the text, the story, the narrative, and the tenor of the times before we criticize the sum total of these or dismiss them outright. We ought also to guard against chronological snobbery, the easy and quick dismissal of ideas from the past.

Not that this old-school view is given much weight today. Even so, I doubt that the change in thinking that Pinker puts forth will occur anytime soon in most university humanities departments. The humanities will remain burdened by the weight of critical theories, a weight that, it seems, increases yearly.

I still enjoy reading and discussing ideas. But I belong to no particular school of thought, nor do I see a need to be locked into a narrow and fixed regime of ideas. This marks me as a contrarian, a label I wear well. So, until things change, I will read my books in quiet solitude and form my own opinions on why I enjoy them.

For those interested in this interesting and important debate, there has been a rebuttal by Leon Wieseltier (“Crimes Against Humanities”; September 3, 2013) and added commentary by both; see [New Republic].

Sunday, March 15, 2015

Red Meat & Colorectal Cancer

Human Diet

The Pescatarian Diet: “Pescetarians, as they are commonly referred, had a 43% lower chance of getting the cancer compared to people with omnivorous diets.”
Photo Credit: Shutterstock
Source: CNN
An article, by Joseph Netto, in CNN says that vegetarians who eat fish—pescetarians—have a lower incidence of colorectal cancer than omnivores, and even a lower risk than vegetarians. That pescetarians fared better than vegetarians suggests that the consumption of fish is a key factor in reducing the risk of colorectal cancer and, moreover, that consumption of red meat increases the risk.

In “Vegetarians who eat fish could be greatly reducing their risk of colon cancer,” Netto writes:
While evidence shows the health benefits of reducing red meat consumption, the recent study highlights the differences between even a fully vegetarian diet and a pescetarian diet. Within the sample group there was a 27% drop in the risk of contracting colorectal cancer if you switch from fully vegetarian to eating fish. The authors of the study suggest that omega-3 fatty acids may be the key to such a low risk of cancer in the pescetarian group.

Nutritionist Lisa Drayer agrees. "In addition to other dietary factors, fish may provide added protection from its high content omega-3 fatty acids. This is consistent with previous research that has found omega-3s have anti-cancer activity and that they may be helpful in the prevention and treatment of colorectal cancer."
This article cites a growing body of scientific studies (the latest in JAMA Internal Medicine) showing a correlation between consumption of red meat and colon cancer. I have changed my diet in the last year and now rarely eat red meat (I find it harder to digest post-op, post-chemo); I do eat chicken and a lot more fish, fruits and vegetables. I not only feel better, I look better.

For more, go to [CNN]

Saturday, March 14, 2015

A Butterfly's Colourful Camouflage Helps It Avoid Hungry Predators

The Natural World

Paper Kite Butterflies: “The sheen of these gold chrysalides offers a shield of camouflage
for paper kite butterflies growing inside them.”
Photo Credit: Michael Weber, imageBroker/Corbis
Source: NatGeo

An article, by Liz Langley, in National Geographic examines why butterflies have such striking colours; take the paper kite butterfly (Idea leuconoe) as a fine example.

Langley writes:
The paper kite butterfly, native to Asia, is light yellow or off-white with an elaborate pattern of swooping black lines and dots. But its chrysalis—a hard case that protects the caterpillar during its final transformation into a butterfly—is a shiny, golden hue. It’s unknown why the chrysalis itself is gold, but its shininess helps camouflage the developing butterfly, says Katy Prudic, a biologist at Oregon State University in Corvallis.

In particular, the sheen is “disruptive” to potential predators—it makes the chrysalis “hard to detect in a complicated background,” Prudic says. A hungry bird may even think it looks like a drop of water.
Camouflage is crucial to chrysalides: Because growing butterflies are unable to move and in danger of being eaten or parasitized, “they're a sitting duck,” Prudic notes.

The giant swallowtail is another example of chrysalis camo. In that species, the chrysalis resembles part of the tree on which it hangs—or it looks a bit snakelike, depending on the vantage point. (Watch video: Growing Up Butterfly.)
This species' caterpillar has some tricks up its sleeve: It can resemble bird droppings but can also look like a tiny snake at a later stage of development.

The monarch butterfly chrysalis has what appear to be gold dots and threads, which help the developing insect blend in with leaves.

Female Monarch Butterfly: “The monarch butterfly (Danaus plexippus) is a
milkweed butterfly
(subfamily Danainae) in the family Nymphalidae. It may be the most
familiar North American butterfly. Its wings feature an easily recognizable orange and black
pattern, with a wingspan of 8.9–10.2 cm (3½–4 in),” Wikipedia says.
Photo Credit: Kenneth Dwain Harrelson; May 29, 2007
Source: Wikipedia

Camouflage is, no doubt, an adaptive mechanism that helps a species avoid being eaten by its natural predators. The list of butterfly predators is long, and includes wasps, frogs, lizards and monkeys. The monarch butterfly (Danaus plexippus) has another adaptive mechanism, in addition to camouflage, that protects it to a large degree from being eaten: monarchs taste bad, a result of caterpillars consuming milkweed before metamorphosing into butterflies.

One butterfly website says that this gives monarchs a chemical defence: “[T]hey sequester the poisonous cardenolides (also called cardiac glycosides) in the milkweed. Cardenolides are poisonous to vertebrates.” The bright colours (yellow, orange, black, and white) of the monarch also act as a signal to potential predators that it has these chemical defences. There are predators, though, that can bypass the monarch’s chemical defences: two species of birds, the black-headed grosbeak and the black-backed oriole, and one species of mouse, the black-eared mouse.

I do not see as many monarchs as I used to in my childhood. I saw only a handful of monarchs in parks and at the botanical gardens last year; besides natural predators, monarchs face changes to their winter habitats in Mexico and a reduction in their food supply here in Canada and in the U.S. The World Wildlife Fund Canada says on its website:
Monarch butterflies are currently facing three major risks: illegal logging, lack of milkweed plants and climate change. WWF’s 2013-14 report from Mexico showed that the number of monarch butterflies wintering there was at its lowest in 20 years. This finding was determined by measuring the amount of forest they occupy; in 2013, the number of butterfly acres decreased to 1.65 acres compared to 27.5 acres in 2003.
There might be a way to increase the population of monarch butterflies that would make both human developers and the many butterfly lovers happy. Co-existence has an important place in the realm of ideas, both in the sciences and in the arts. It is not a matter of keeping the natural world pristine [see Primitivism], but of maintaining it to a sufficient degree as to not harm humans. This includes the appreciation of beauty and nature, a Wordsworthian idea, perhaps, but nevertheless steeped in longing and pleasures of the human kind [see Romanticism].

For more, go to [NatGeo]

Wednesday, March 11, 2015

Killing Cancer By Viruses

Cancer Research

This is a continuation of a post, “Helping the Body Self-Heal” (March 8, 2015).

This documentary, produced by Vice Media, aired originally on HBO on February 27, 2015; it shows how genetically modified forms of viruses like measles, smallpox and HIV can be used as cancer-killing agents, chiefly by harnessing the human body’s natural defences to respond in attack mode to foreign and mutant cancer cells.

Shane Smith, the show’s narrator and founder of Vice Media, makes the following astounding statement:
Today in real time there is a revolution happening in the treatment of cancer, and the story is almost too incredible to believe: (a) that the diseases that used to kill us en masse, like smallpox, measles, and even HIV, are actually holding the key to stopping the disease in its tracks, and (b) that for the first time in medical history we just might be on the verge of curing cancer.
Some hyperbole? Perhaps, but we have entered a golden age of cancer research, and these are exciting times for both researchers and patients. It is important to understand that medical researchers have known for more than a century that the body’s immune system kicks into high gear as a result of fevers or infections, during which macrophages and T-cells swallow up mutant cancer cells. This knowledge is helping medical researchers look at novel approaches to battling cancer that centre on immunotherapy: helping the body’s robust immune system defeat cancer.

Monday, March 9, 2015

Studying Engineering

On Professions

McGill University's Roddick Gates: This is the main entrance to the university's downtown campus at Sherbrooke Street West and McGill College Avenue. The Arts Building, the focal point, was built in 1839 and is the oldest building on campus. During my student days, I used to give tours of the campus to prospective students and their parents. We often met at these gates, and this is where the tour both began and ended.
Photo Credit & Source: HerCampus

Picking a profession can be difficult, a daunting task, at a young age. I knew from an early age—I think I was eight or nine—that I wanted to work in the field of applied sciences, and engineering fit the bill. A few years later, when I found out that my older cousin, Gordon, was studying mechanical engineering at McGill University, I was more certain of my initial choice of professions.

My dad did not want me to become an engineer, or at least this was his initial view; his desire was that I become an accountant or some profession related to business or commerce. In short, a nice “Jewish profession.” These were on the short list of good professions that also included doctor, lawyer, dentist and businessman, or what one would today call an entrepreneur. (I did have an entrepreneurial spirit at a young age, but more on that in another post.)

And, yet, he accepted engineering as a viable and good profession once he understood what it was that engineers did and how they were viewed in society. He said, “So, you want to become an engineer; I spoke it over with my friends, and they said it was a good profession. Arbetn shver” (“Work hard”). This was easy: I always believed in working hard to gain something good and important. Although I was initially bothered that he had to get the “blessing” of his lantsmen, I was later relieved that he endorsed my choice of professions. Moreover, and equally important, I wanted my father to be proud of me and my accomplishments.

Engineering is one of the professions considered important for advancing society, one of the four fields cited in STEM education (science, technology, engineering and math). If you want to study engineering, you should be not only above average in math but also comfortable with problem-solving and with working with mathematical equations; you will take courses in advanced calculus and in partial differential equations, which you will use in most of your courses, including thermodynamics, heat transfer and fluid mechanics.

I took a look at the current course listing for the bachelor of engineering program at McGill University’s Department of Mechanical Engineering, and the required courses are quite similar to those of my time, more than three decades ago. This confirms my view that the fundamentals of engineering remain the same.

Academically, engineering is tough; you will have little social life and will spend most of your time in class, in labs, at the library or at home doing assignments. You will often feel overloaded and tired; this will become a part of your life, for engineering is dedicated to hard work and problem-solving. But no matter what profession or career you follow later on, the discipline of an engineering education will serve you for life.

McGill’s Macdonald Engineering Building: After a fire gutted the original building, it was rebuilt in 1907; although not in view here, there is a phoenix rising from the ashes carved on the south wall, McGill says on its website, “as a reminder of the fire and a symbol of rebirth.” This building is adjacent to the Milton Gates and is one of two buildings dedicated to an engineering education, the other being the McConnell Engineering Building, built in 1959.

Photo Credit: Dallas Curow
Source: McGill Faculty of Engineering

Sunday, March 8, 2015

Helping The Body Self-Heal

Immunotherapy: It is known that spontaneous remission generally coincides with a fever or infection, which causes the body’s robust immune system to respond quickly and mercilessly to an attack; during this time the body’s immune defenses, which include macrophages and T-cells, quickly gobble up mutant cancer cells. Such is the current thinking in medical science.
Photo Credit: SPL
Source: BBC Future

An article by David Robson in BBC Future looks at the mysterious cases of people diagnosed with cancer who undergo what doctors call spontaneous remission, in which the body cures itself without outside treatment such as chemotherapy or drugs. This is a rare occurrence, happening to only one in 100,000 cancer patients.

Some might call these rare cases miracles, but what scientists do know is that in these cases the body’s immune system kicked into high gear as a result of fevers or infections, with macrophages and T-cells swallowing up mutant cancer cells. This knowledge is helping medical researchers look at novel approaches to battling cancer that centre on immunotherapy, or helping the body’s robust immune system defeat cancer.

Robson writes in “Cancer: The mysterious miracle cases inspiring doctors” (March 6, 2015):
Could infection be the key to stimulating spontaneous remission more generally? Analyses of the recent evidence certainly make a compelling case for exploring the idea. Rashidi and Fisher’s study found that 90% of the patients recovering from leukaemia had suffered another illness such as pneumonia shortly before the cancer disappeared. Other papers have noted tumours vanishing after diphtheria, gonorrhoea, hepatitis, influenza, malaria, measles, smallpox and syphilis. What doesn’t kill you really can make you stronger in these strange circumstances.

It’s not the microbes, per se, that bring about the healing; rather, the infection is thought to trigger an immune response that is inhospitable to the tumour. The heat of the fever, for instance, may itself render the tumour cells more vulnerable, and trigger cell suicide. Or perhaps it’s significant that when we are fighting bacteria or viruses, our blood is awash with inflammatory molecules that are a call to arms for the body’s macrophages, turning these immune cells into warriors that kill and engulf microbes – and potentially the cancer too. “I think the infection changes the innate immune cells from helping the tumours to killing them,” says Henrik Schmidt at Aarhus University Hospital in Denmark. That, in turn, may also stimulate other parts of the immune system – such as our dendritic cells and T-cells – to learn to recognise the tumorous cells, so that they can attack the cancer again should it return.

Schmidt thinks that understanding the process of spontaneous remission is vital, since it could help refine the emerging class of “immunotherapies” that hijack our natural defences to combat cancer. In one treatment, for instance, doctors inject some cancer patients with inflammatory “cytokines” in order to kick the immune system into action. The side effects – such as high fever and flu-like symptoms – are typically treated with drugs like paracetamol, to improve the patient’s comfort.

But given that the fever itself may trigger remission, Schmidt suspected that the paracetamol might sap the treatment’s potency. Sure enough, he has found that more than twice as many patients – 25% versus 10% – survive past the two-year follow-up, if they were instead left to weather the fever.

There could be many other simple but powerful steps to improve cancer treatment inspired by these insights. One man experienced spontaneous remission after a tetanus and diphtheria vaccination, for instance – perhaps because vaccines also act as a call to arms for the immune system. Along these lines, Rashidi points out that receiving a standard vaccine booster – such as the BCG jab against tuberculosis – seems to reduce the chance of melanoma relapse after chemotherapy.
Whether or not medical researchers ever fully understand the mechanisms of spontaneous remission is not as important as learning and benefiting from its results. It seems that cancer treatment in the form of immunotherapy is on the right track to help defeat cancer; that our bodies will be doing most of the work, without the need for treatments that have unwelcome and unpleasant side effects, is even better news. Overall, the day when this becomes the normal and preferred mode of treatment can’t come too soon.

For more, go to [BBC Future]