Thursday, August 19, 2010

New Home

Blogspot has treated me well, but I'm moving on to a bigger host. Come check out www.mommiologist.com and let me know what you think!

Saturday, August 7, 2010

It's time


I had an unpleasant realization the other day. My friend Kerri Ann came over so I could sign her kids' passport applications, and I was excited to fill out forms. That's it. I need to go back to work. Part of it is that the summer is waning, and as someone who spent roughly twenty-four years in school, I always feel the urge to sign up for classes and buy school supplies in August. Part of it is that as Hannah gets more mobile and interesting and sleeps less, I have less time to blog. Part of it is guilt that I still have not finished THE EXPERIMENT THAT WILL NOT DIE!! But the biggest part is that though I love my little girl, I'm getting bored. I'm sorry, Hannah. I love you dearly. But spending most of every day in the house is making me a little bit batty.

The poor girl. During a raucous exploration of the fun of ripping paper, she pulled a page out of a magazine, and then tried to put it back. I explained to her that she couldn't put back a page that had been ripped out, because time is unidirectional and entropy increases without an input of energy. Entropy is a measure of disorder, and Hannah is a force of it. While reading her book Busy Bee and Friends, she got a lecture on how coleopterans are the most taxonomically diverse group of invertebrates on the planet, and on how, during the Carboniferous period, dragonflies ruled the skies, at least as far as insects are concerned. And finally, I let her know that pears are gritty because they are full of sclereids, a cell type with very thick and jagged cell walls. The worst part? I learned all this so long ago that I can't remember whether it's all correct. Be sure to let me know.

I come by this earnest nerdiness honestly. Some of my earliest memories of conversations with my dad include an explanation of the speed of light and its implications for time travel. He encouraged me from a young age to become a marine biologist, read about biology and physics, and discuss anything nerdy. Because to us, the nerdy things are important.

So here I am now, nine months into a decidedly non-nerdy time. It's been WONDERFUL. Hannah is a bright, extraordinarily happy and delightful little girl, and watching her go from a largely unresponsive infant to a little person is fascinating. The cupboards and closets are organized, the spare room is decorated, and I have planted a vegetable and shade garden. I've started this blog, published the last paper from my PhD, and submitted freelance articles to a few different outlets.

But now, I'm done. My domestic chores are suffering: the lawn is getting long and I haven't made baby food in weeks. Readers, I am BORED.

I worry, of course, about putting Hannah in daycare. She was put on numerous waiting lists before she was even born, and it's a gut-wrenching, nerve-wracking decision to make. But unlike a lot of women, I don't worry about how having a working mother will affect her. My mother worked our whole lives, and gave my sister and me an increased independence and the drive to have our own careers. I always felt that if you tell your little girl that she can be all she wants to be, then it's imperative that you live that message yourself - regardless of whether your dream is to be a CEO or a stay-at-home mom.

Kismet intervened, as it so often does, while I was reading the Globe and Mail the other day. There was an article by Leah McLaren on guilt in working mothers. She references a 2009 study looking at outcomes for the children of working mothers (Joshi et al. 2009. Combining Child Rearing With Work: Do Maternal Employment Experiences Compromise Child Development? CLS Working Paper 2009/1). In particular, the authors were interested in whether the intensity of the mother's work during the second half of the first year of the child's life affected the development of the child's cognitive ability. In short, working mothers did not have intellectually or emotionally stunted children. They were fine. I don't find that surprising, and I don't think the mothers I know, working and not, find it surprising either. There are many ways to raise a happy, healthy child, and the happiness of the mother is hugely important.

So it's going to be an interesting week for this new mom: I'm supposed to meet with my old bosses to discuss a return to work, and I've also had a job interview for a position at a biotech company. It has felt wonderful talking science again, even if it took me about 10 minutes to remember the phrase "optimal foraging." My blissful first year as a mom is entering its last quarter, and now I'm starting to have to reconcile the personal and professional sides of me. Things are going to get interesting, but I'm looking forward to a bit more balance. And I think Hannah, who really just wants to play, will be looking forward to a break from mom's lectures too.

Sunday, July 25, 2010

Who's imitating whom?


Hannah's turned into a great eater. And before you decide that this blog has degenerated into self-satisfied mommydom, where I list off her preferred foods and tell you how cute it is when she can't quite get something into her mouth (sweet potato, chicken and avocado, and the self-feeding thing IS hilarious), stick with me. She wasn't always a great eater; in fact she was quite picky until a Stampede breakfast of pancakes and yogurt got her going - oh Stampede, you provide so much culinary delight! But I also think that a big part of what encouraged her to eat was something my mom teased me about: I open my own mouth when I want her to open hers, and I don't even know I'm doing it. My mom does it too, and I'm starting to think it's a universal mom thing. So, of course, I checked Google Scholar.

There isn't a lot of research specifically on why mothers open their own mouths to get their babies to eat, but let's be honest, the topic doesn't scream Nobel Prize. What is fascinating, however, is the research showing just how important social cues are to infant development. Most baby books emphasize how emotionally immature infants are - they cannot smile, cannot see their parents' faces clearly, and in fact have no awareness of their own faces or limbs. However, research has shown that infants as young as twelve days old are capable of imitating both facial and manual gestures (Meltzoff and Moore, 1977, Science 198:75). This implies an awareness of others, and of their own selves, that was previously thought to take months to develop.

Imitation, even in adults, turns out to be more common than we like to think. How many of us have been asked what kind of accent we have by someone with an accent, only to sheepishly realize that we've been unconsciously copying it? Compulsive imitation can occur in stroke patients with lesions in their frontal lobes (Lhermitte, F. et al. 1986, Annals of Neurology, 19:326), which suggests that their ability to voluntarily control their imitative behaviour is compromised. However, it was experiments in macaques that showed just how important imitation is in a social species.

In the early 1990s, Giacomo Rizzolatti and coworkers were surprised to discover that certain neurons within the brains of macaques activated not only when the monkeys performed a simple task like grasping food, but also when they watched someone else (macaque or human) doing the same task. In other words, perception of an activity resulted in activation of the neurons that would perform that task. The authors termed them mirror neurons. In 1996 a similar phenomenon was observed in humans using PET imaging: when subjects watched someone grasp an object, they showed activation not only in the visual cortex, but also in the motor areas that would be involved in grasping (Rizzolatti and Fabbri-Destro, 2010, Experimental Brain Research, 200:223). This observation revolutionized our understanding of cognition; previously it was thought that an observed act required higher-order processing in order to be understood. This may still happen for complex tasks, but simple tasks elicit an extreme empathy - the observer mentally imitates the object of her vision.

And in fact mirror neurons may have a lot to do with empathy. When interacting with people, we unconsciously mimic our conversational partner. Children diagnosed with autism, a disorder that includes deficits in empathy, do not mimic others when interacting, and fMRI studies have shown that areas of the brain that become active during observation in typically developing children were underactivated, or silent, in children with autism (Rizzolatti and Fabbri-Destro, 2010, Experimental Brain Research, 200:223). In fact, being unable to mimic an expression may impair one's ability to empathize: a study currently under review suggests that subjects whose frown muscles were frozen by Botox, rendering them unable to frown, were impaired in their ability to interpret sentences describing sad situations (Havas et al. 2009).

Do infants mirror the adults around them in the same way? It has been established that infants mimic their parents at an astonishingly early age, but this could simply be reflex. Imitation in infants starts to lessen after about five months, which is when other reflexive behaviours also start to wane. However, infants have been shown to mimic not only when they see an action, but also when they hear a stimulus associated with that action. This suggests that there might be something more complex going on than simple reflexive mimicry (Bertenthal and Longo, 2007, Developmental Science, 10:526).

Despite the tantalizing clues, there remains a great deal of uncertainty in this area. Mirror neurons have not been directly observed in humans in the same manner that they have in monkeys. Though the name is catchy, it's not clear whether there are specific neurons in the brain that perform mirror tasks, or whether this is a more generalized phenomenon. However, it is clear that there's a neural basis to empathy that is more specific and more important than we had previously supposed. The findings that we reflexively try on the emotions of those with whom we communicate suggest that empathy is not simply a nicety but an integral part of how social beings interact. In a bond as strong as that between mother and child, the ties of empathy are crucial, and if it means that mother has to look silly, with phantom bites at an invisible spoon, then so be it.

Friday, July 23, 2010

Busy Week


I haven't had time to post all week, but I will give you a quick update. I am very lucky: this week has been taken up by wedding festivities for two friends. Alexis and Carey are getting married in Nova Scotia, so we can't go, but I spent the weekend in Jasper drinking too much with Alexis and some other friends. And between the wedding of Yvonne and Tim and my friend Rebecca's ride in support of cystic fibrosis research (I told you about Brian's Ride to Conquer Cancer, but Rebecca also rode her bike from Vancouver to Calgary - incredible!) I have been able to see my friends from grad school. Grad school was an ordeal in and of itself, but in those years we all had our innings with breakups, marriages, divorces, babies, sick parents, loss of parents, moving, new jobs, and new jobs again. They're a special group - and not just for the history we share. They're funny. Really funny. All of them. That's a rare thing, people who are truly witty, so I'm grateful.

Should be an interesting weekend!

Tuesday, July 13, 2010

Human Nature

There have been a couple of articles lately that have given me pause because of their poor interpretation of human nature. I can be a misanthrope with the best of them, and certainly living in the reactionary right-wing bastion of Canada, at the end of the Bush years and soon after the G20, does nothing to improve my opinion of humankind. But I'm also a mom to a darling little girl, who's going to inherit the earth I leave her, so the nature of the dominant species on this planet is an important question to me.

In Scientific American, John Horgan discusses the book Mothers and Others by Sarah Blaffer Hrdy ("Our nature is nurture: Are shifts in child-rearing making modern kids mean?"). Sarah Blaffer Hrdy is an evolutionary biologist and anthropologist who sounds like someone I might like. Dr. Hrdy worked as an anthropologist during the years when women's contributions to science were not readily accepted, and raised three children while doing it. Her book emphasizes the type of child-rearing that allowed her to pursue her career: allocare, or help with child-rearing from members of the group other than the mother. Group child-rearing differentiates us from the rest of the great apes, who rear their offspring individually.

In his article, John Horgan goes on to discuss modern human societies. Many of today's children receive most of their care from non-kin, and often the care they get from their parents is distant, which can result in disorganized attachment. Might this result in a loss of empathy in our species, if empathy is no longer selected for? He then cites a recent study showing that empathy in college students has suffered a precipitous decline in the last 30 years.

I'm going to go out on a limb here and suggest that Sarah Blaffer Hrdy probably didn't say this in her book, because no evolutionary biologist would suggest that a phenotype can be meaningfully selected against in one or two generations. Not only is 30 years much too short a time frame for the evolutionary loss of any phenotype, it would also require that empathy be strongly selected against. Certainly the misanthrope in me thinks this is the case in modern society, but in reality empathy is required for the most basic human interaction and communication, so strong selection against it seems unlikely to me. It wasn't the science of the article that upset me, though. At first I couldn't put my finger on what was.
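To put some numbers on that limb, here is a minimal sketch in Python (entirely my own toy model - the "empathy allele," its starting frequency, and the selection coefficient are all invented for illustration) using the standard haploid selection recursion. Even implausibly strong selection barely dents a common allele in two generations:

```python
# Toy model: frequency of a common "empathy" allele under selection
# against it, using the haploid recursion p' = p(1 - s) / (1 - s*p).

def select(p, s, generations):
    """Allele frequency after the given number of generations of selection."""
    for _ in range(generations):
        p = p * (1 - s) / (1 - s * p)
    return p

p0 = 0.9   # assume the allele starts out common
s = 0.10   # an implausibly strong selection coefficient against empathy

print(select(p0, s, 2))   # ~0.88: two generations (~30 years) barely move it
print(select(p0, s, 50))  # ~0.04: real evolutionary loss takes on the order of a millennium
```

Thirty years is noise on an evolutionary clock.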

Then tonight I reread an old issue of Discover and came across an article by one of my favorite columnists, Bruno Maddox. He shares my love of Ghost Hunters and has written eloquently on one of Darwin's bigger mistakes. In "The Body Shop" (Discover, May 2010, p. 43) he discusses the rise of the robot, and wonders why people aren't as disturbed by robotics as they used to be. A large body of 20th-century literature examined the fine line between the potential advantages and dangers of robotics, and until just recently we all seemed terrified of the thought of robots taking over the earth. Now, as robots adopt human-like faces, vacuum our floors, and take over the battlefield, we seem to have quietly given up our fears of being made obsolete. Why? Maddox suggests that it's because we've stopped thinking we're any good at all. The basis of our fear of robots was our belief in the unique nobility of humans. Maybe, in the age of the internet, we suddenly realize that humans are largely stupid. "[Man] enjoys porn and photographs of cats on top of things. He spells definitely with an a, for the most part, and the possessive its with an apostrophe. On questions of great import or questions of scant import, he chooses sides based on what, and whom, choosing that particular side makes him feel like, and he argues passionately for his cause, all the more so after facts emerge to prove him a fool, a liar, and a hypocrite." Perhaps, then, our fear of robots has been replaced with a hope that they can be programmed to be the intelligent, rational, even empathic beings that we have failed to be.

And there it is: that peculiarly Judeo-Christian idea of the fallen angel, the belief that mankind's nature is inherently selfish and irrevocably sundered from our noble origins. This idea has coloured western philosophy for centuries: Thomas Hobbes' depiction of life as naturally "solitary, poor, nasty, brutish and short"; Garrett Hardin's Tragedy of the Commons. I can't overstate how much it has shaped the thinking of the west, and we hardly seem to question it. But we should question it, strongly. Because in a world where humans are basically jerks, it becomes very difficult to put in the energy to make things better. It becomes easier to care less for our neighbours, and easier to accept injustice.

It would be easy to categorize my objection to this poor view of human nature as a womanly or sentimental instinct to protect my child, and to be perfectly honest, that is part of it. No parent wants their child to grow up in a society that is irredeemably selfish. But the bulk of my objection comes from the fact that this cynical view of humanity is just too easy. It's simple to look at the state of the world and roll your eyes. Cynicism takes no courage and leaves no room for outrage.

The cynic in me doesn't have a hard time accepting that college students today are less empathetic than they were thirty years ago, but the optimist in me strongly doubts that this is due to some genetic decline. Instead it seems clear the problem stems from a culture that strongly rewards self-interest and devalues idealism. Even that is not the real tragedy. The real tragedy is that, as with a rapacious economy and lying politicians, we think this is the natural, immutable progression of our society. There is no scientific or anthropological reason to believe that human societies are limited by rapacious self-interest, so let's not give in to a poor philosophical excuse.

Thursday, July 8, 2010

Delicious

Oh, the science gods are smiling on me today. No, I have not finished THE EXPERIMENT THAT WILL NOT DIE, but it's getting there. And no, I have not found a new job. Stop asking. No, it's even better. Someone high up provided me with an opportunity to indulge that smug instinct present in all nerdy kids - that know-it-all sense of superiority and absolute glee in someone else's intellectual misstep that got me to where I am today - the opportunity to say "NO, that's WRONG. You IDIOT."

I signed up for Twitter when I started blogging because apparently it's what bloggers do. I log in sporadically, face that "What's happening?" box with a stupid look, and inanely enter something self-indulgently referential to either my blog or Hannah. (Follow me on Twitter! I'm the mommiologist and I promise to be interesting from here on!) However, I am actually enjoying my updates from the likes of the Cassini spacecraft, Jane Goodall, Barack Obama, and Stephen Colbert. And today, I got this tweet (is that what the kids are calling it these days?): "NatGeoSociety #Video: Why do cats always land on all fours? http://on.natgeo.com/cAHSh9 #animals"

Oh National Geographic, you idiot.

We all know that cats land on all fours. Our childhood cat Fluffy was rendered a drooling, nocturnal recluse by a family friend's attempt to prove this by launching her off the dining room table. She landed on all fours physically, but maybe not so much emotionally. Other people still seem to think cat-throwing is a viable scientific endeavor, and even managed to get ethics approval for the study that provided the footage in that video.

In 1987, a vet named Dr. Michael Garvey at the Animal Medical Center in New York noticed an interesting trend: as expected, there's a strong relationship between the number of injuries a cat sustains and the number of stories from which it falls, but only up to roughly six floors. Cats that fall from greater heights tend to have fewer injuries, which seems strongly counterintuitive. Dr. Garvey explained this phenomenon by saying that such a height allows cats time to a) reach terminal velocity, b) right themselves, and c) relax. Hitting the ground relaxed, even at terminal velocity, hurts a lot less than hitting it tense, apparently.

I remember, back in the dim days of the last century when I was an undergraduate, discussing this case in a class called Ecological Methods. The professor, a sarcastic, slightly bitter hippy type who probably had a very bad case of the know-it-alls as a child, went through the whole story, then asked: "can anyone see what's wrong with this study?"

Silence.

Finally, a smarty-pants from the back of the class asked: "Did anyone count the dead cats?"

Consider a human falling from a window: regardless of whether the person survives, there will be considerable paperwork. Definitely a record of some sort. However, if your cat falls from a window and you reach it and it's an ex-cat, joined the choir eternal, you aren't going to pay good money to have a vet declare it dead. You will get a shovel. The author of this study, however, took his data from vet clinic admission records, so you can see how a large subset of the data went missing. Maybe those cats that fell ten stories and bounced off an awning before hitting the ground skewed the results a little.
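And because I can't resist, here's a little simulation of the smarty-pants's point - a sketch in which every number is invented, not the actual clinic data. In it, the true harm of a fall only ever increases with height, but dead cats never reach the vet, and the rare survivors of big falls are mostly the lucky ones whose fall was broken on the way down. The recorded injury curve rises and then falls, no feline relaxation required:

```python
# Survivorship bias, feline edition: vet records only contain the cats
# that lived long enough to run up a bill.

import random

random.seed(2010)

def one_fall(floor):
    """Return (injury_score, survived) for a single fall. All numbers invented."""
    if random.random() < 0.2:
        # Lucky break: an awning, balcony, or shrub absorbs most of the fall.
        effective_floor = random.uniform(1, 3)
    else:
        effective_floor = floor
    injuries = max(0.0, random.gauss(effective_floor, 1.0))
    return injuries, injuries < 9  # cats hurt worse than this die at the scene

def recorded_mean(floor, n=20_000):
    """Mean injury score among cats that survive to reach the vet."""
    scores = [inj for inj, alive in (one_fall(floor) for _ in range(n)) if alive]
    return sum(scores) / len(scores)

# Prints a curve that climbs to about eight floors and then drops,
# even though the true harm of a fall never stops increasing.
for floor in (2, 4, 6, 8, 10, 12):
    print(f"{floor:2d} floors: {recorded_mean(floor):4.1f} injuries per recorded cat")
```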

I thought this story was dead and only good for inducing smug giggles in other scientists, but no! So thank you, National Geographic, thank you.

Monday, July 5, 2010

More Genetic Conflict

I have posted before on my miserable pregnancy and delicately blamed my darling little girl for it. But she's cute, and I like her, so I really can't blame her for anything. Her father, though..... Brian's wonderful, an amazing father, and truly supportive. He's cute too. But I just read an article that suggests he might have a darker side. You see, it might have been overexpression of his genes in the placenta that led to my crummy pregnancy. It's also possible that underexpression of genes derived from my side was at fault, but let's not blame the victim.

At first, the idea that the expression of genes in the placenta can depend on which parent they came from doesn't seem revolutionary, but if you think about it for a minute, you can see that Mendelian inheritance cannot explain why this should happen. A gene coding for, say, the development of an eyeball codes for the development of an eyeball regardless of whether it came from the father or the mother. Outside of the sex chromosomes, there is nothing in the DNA sequence that tells the infant which parent a given gene came from. However, you can see how, from a Selfish Gene perspective, this could be useful - what if you were a male who decided that the death of the mother wasn't a big deal, as long as you got a healthy baby that carried half your genes? Hopefully, if you were a human male, you'd be thrown in jail and sterilized for such behavior. However, nature is red in tooth and claw, and if you're a gene, such an influence might spread - but only if it was always passed from the male. This requires a way to temporarily modify gene expression: to turn a gene on only if it is derived from the father, a genetic change that lasts for a generation but no longer. This is done by epigenetic modification of the DNA.
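To make the parent-of-origin trick concrete, here's a toy sketch (my own cartoon, not real genetics software; IGF2 really is a paternally expressed growth gene, but everything else here is simplified past the point of parody). The silencing mark is applied in the gametes according to the parent's sex, so identical DNA behaves differently depending on who delivered it:

```python
# A cartoon of genomic imprinting: the same gene, silenced or not
# depending on which parent's gamete it rode in on.

from dataclasses import dataclass

@dataclass
class Allele:
    parent: str       # "mom" or "dad"
    methylated: bool  # epigenetic silencing mark set during gamete formation

def make_gamete(parent: str, gene: str) -> Allele:
    # Imprints are reset and reapplied every generation, in the gametes,
    # according to the parent's sex - they last one generation, no longer.
    maternally_silenced = {"IGF2"}  # paternally expressed growth genes
    return Allele(parent, methylated=(parent == "mom" and gene in maternally_silenced))

def expressed(alleles):
    """Which copies actually get read out in the offspring."""
    return [a.parent for a in alleles if not a.methylated]

offspring = [make_gamete("mom", "IGF2"), make_gamete("dad", "IGF2")]
print(expressed(offspring))  # ['dad'] - only the father's copy is active
```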

In order to understand why epigenetic inheritance is so weird, and so revolutionary to the field of genetics, we have to go back to the Origin of evolutionary theory - pardon the pun. Before inheritance was completely understood, there were two major competing theories as to how evolution happened. We now understand that genes are the mode of inheritance. We know that they are inherited from our parents, and while smoking or sunburns might cause random mutations, there is no way we can alter our genome in a manner that affects future generations. This is in contrast to the idea that an individual may acquire characteristics during their lifetime that are then passed on to their offspring. The most famous example of this inheritance of acquired characteristics, or Lamarckian inheritance, is the idea that the giraffe got its long neck by stretching its head to reach leaves: a giraffe that really stretched its neck out would have offspring with long necks as well. This theory, proposed by Jean-Baptiste Lamarck, was disproven by August Weismann in the 1880s, when he showed that inheritance is only possible through the gametes (egg and sperm), as opposed to somatic cells (the cells everywhere else). Any changes to cells within the body therefore cannot be passed on to offspring.

This was the understanding of genetic medicine for years: genetic disorders are possible, but they arise and evolve over thousands of generations through the breeding success and failure of thousands of individuals. However, a seminal series of papers collectively called the Overkalix study showed how experiences in an individual's lifetime could result in disease within one or two generations. The Overkalix parish in remote northern Sweden is home to a population that has endured repeated sudden famines, and equally sporadic bumper crops, over the last two hundred years. The study showed that if a boy experienced a period of plenty during his preteen years, his grandson was more likely to die of cardiovascular disease. This goes against everything we know about Mendelian inheritance: the effect appears within two generations, and is passed only along the male line.

This sounds suspiciously like Lamarckian inheritance. In fact, in this case inheritance is mediated not by changes to the genetic code, but by changes to the proteins associated with a string of DNA - that is, by epigenetic changes. The environment can trigger changes to the proteins that fold DNA, or cause chemical tags to be added to the DNA backbone. Either of these modifications can "imprint," or change, the expression of certain genes, and in the case of epigenetic inheritance, can alter how genes are expressed in subsequent generations.

In the case of preeclampsia, the disorder is likely caused by imperfect imprinting. Preeclampsia affects 3-7% of pregnancies and is characterized by increased blood pressure and protein in the urine. Left untreated, it can develop into HELLP syndrome (Hemolysis, Elevated Liver enzymes, and Low Platelets), and can progress to seizures and the death of the mother. Preeclampsia is often inherited maternally, which suggests that the problem comes down the female side, but paternally derived genes have also been implicated. So when it comes down to it, there is no way to determine whether my crappy pregnancy was the fault of Brian's genes or mine. But I am definitely heartened by the fact that if I had to have a disorder, at least it was a genetically interesting one.