While the dead don't care, the dead matter. The dead matter to the living. And while getting the dead where they need to go, we help the living get where they need to go, too. Thomas Lynch, "The Undertaker" (Frontline)
My water broke just after midnight on December 26, 1987. We left our two-year-old daughter in the capable hands of a family friend and trekked through the unseasonably warm, rainy night to the hospital. Our doctor met us there in the emergency room. He was a GP, back when some GPs still delivered babies. He was a kind and gentle man.
All the excitement started to fade as he checked me. After a while he said he was ordering an ultrasound. "I don't know if the baby is head first. I think I feel hair, but I don't feel the skull." I lay there, and Will held my hand. We were quiet. While the technician was doing the ultrasound, he asked me a few questions, but one question has remained in my memory because of the way it made my heart freeze. "I understand you have a healthy child at home?"
Yes, we had a healthy child at home. This question was the first real understanding that we didn't have a healthy child in utero.
The technician and the doctor left, obviously consulting on the findings. I started to cry and told my husband "I guess we'll be doing this again." An odd thing to say, I know.
About 10 minutes later our doctor returned. He didn't have the poker face down. His eyes were red and he was struggling somewhat to keep his composure.
"The baby has some major problems. He has a fully developed brain, but no skull. His lungs are not fully developed, and his heart is not formed correctly. He cannot survive. The baby is going to die."
The next six or seven hours are a bit of a blur. My labor was weak, so they started me on Pitocin. Not fun. I asked for an epidural, thinking I was not emotionally capable of going through labor knowing my child would die after birth. But this was a holiday night and there was only one anesthesiologist at the hospital, and he was busy in the emergency room with a car wreck. I would do this on my own, and in retrospect, I'm glad I did. I remember watching TV; there was a horse show on ESPN. I love horses, and I watched them gracefully circle the ring taking impossibly high fences for over an hour, losing myself in their rhythm and motion.
Toward dawn we started the phone calls, even as I labored toward the final hour or so. My mom, my best friend, our elder. "Mom, the baby isn't going to make it." "Susan, the baby isn't going to make it." "Please pray for strength."
As I moved into transition, an ultrasound technician arrived to do a high-definition ultrasound of the baby. Looking back later I had to laugh at the absurdity of it. I lay on my back enduring the worst labor pains so he could get a good image of a sick baby that would be here in an hour or less. He could have waited and just looked at the baby itself! But, of course, I was in a teaching hospital. I shouldn't be surprised. And Will and I weren't in any shape to question.
At the end of the ultrasound the anesthesiologist came in. I shook my head, looked at my doctor and said, "I have to push!" Isaac William made his very short debut a few minutes later. He moved very little. The nurse and doctor were in tears. My nurse was Catholic and asked permission to baptize Isaac. I said yes. We all needed comfort, and I wouldn't deny her that. Anyway, I've always liked that we baptized him: "Isaac William G., I baptize you in the name of the Father, and of the Son, and of the Holy Spirit. Amen." Isaac died in my arms after only 10 minutes or so. Our doctor prayed for all of us. God was there in that room. Isaac stayed with us for almost an hour. And then we said good-bye.
We allowed the hospital to take tissue for research. Isaac was diagnosed with "Osteogenesis Imperfecta, Perinatal Lethal Form." We found out later during genetic counseling that it was most likely a dominant new mutation and there was little or no chance we'd have another child with the disorder. Since I am a biologist with an emphasis in genetics, I actually understood what the counselors were telling us.
But I jump ahead of the story. Since I had just delivered a baby, they sent me to the maternity ward. Not a good move. I could hear them wheeling the babies to the other mothers. I asked to go home. The hardest thing I've ever done is leave that hospital without my baby.
We didn't have a funeral or a memorial service for Isaac. It was the day after Christmas, and I remember thinking, "This will be so hard on everybody, trying to change plans to come for a funeral." I didn't even know where to begin, and we had almost no money. But I regret that decision.
Oh, I don't mope about it. But I now know the importance of the dead to the living. Other people besides Will and I needed to meet and say goodbye to Isaac. If I'd known then what I know now...
So I dedicate this to the memory of my son, Isaac William. Born and reborn December 26, 1987. As I look over my five surviving children, I still see the gap where you should be. My visitor from heaven.
VISITOR FROM HEAVEN - 1993
A visitor from Heaven
If only for a while
A gift of love to be returned
We think of you and smile
A visitor from Heaven
Accompanied by grace
Reminding of a better love
And of a better place
With aching hearts and empty arms
We send you with a name
It hurts so much to let you go
But we’re so glad you came
We’re so glad you came
A visitor from Heaven
If only for a day
We thank Him for the time He gave
And now it’s time to say
We trust you to the Father’s love
And to His tender care
Held in the everlasting arms
And we’re so glad you’re there
We’re so glad you’re there
With breaking hearts and open hands
We send you with a name
It hurts so much to let you go
But we’re so glad you came
We’re so glad you came
Saturday, October 27, 2007
The Folly of Multitasking...
This is an article from The Week. It was in the regular column called The Last Word. It's a bit long, but humorous and enlightening. Discussing the article with my husband led to a talk about how to set up "flow" times...uninterrupted times in our days where we can focus on one thing and get it done well. That has led me to think of ways I can organize my week into a sort of block schedule that allows me to stop multitasking so much. If the research cited in the article is true, then it might be one reason I'm battling mild depression and short-term memory glitches.
The Last Word
The folly of multitasking
Our cell phones and computers had us convinced we could do five things at once. But neuroscience, says novelist Walter Kirn, is now finding that the mental gymnastics required actually dumbs us down.

In the Midwestern town where I grew up (a town so small that the phone line on our block was a “party line” well into the 1960s), there were two skinny brothers in their 30s who built a car that could drive into the river and become a fishing boat. My pals and I thought the car-boat was a wonder. A thing that did one thing but also did another thing—especially the opposite thing, but at least an unrelated thing—was our idea of a great invention and a bold stride toward the future. Where we got this idea, I’ll never know, but it caused us to envision a world-to-come teeming with crossbred, hyphenated machines. Refrigerator–TV sets. Dishwasher–air conditioners. Table saw–popcorn poppers. Camera-radios.
With that last dumb idea, we were getting close to something, as I’ve noted every time I’ve dropped or fumbled my cell phone and snapped a picture of a wall or the middle button of my shirt. Impressive. Ingenious. Yet juvenile. Arbitrary. And why a substandard camera, anyway? Why not an excellent electric razor? Because (I told myself at the cell phone store in the winter of 2003, as I handled a feature-laden upgrade that my new contract entitled me to purchase at a deep discount that also included a rebate) there may come a moment on a plane or in a subway station or at a mall when I and the other able-bodied males will be forced to subdue a terrorist, and my color snapshot of his trussed-up body will make the front page of USA Today and appear at the left shoulder of all the superstars of cable news.
While I waited for my date with citizen-journalist destiny, I took a lot of self-portraits in my Toyota and forwarded them to a girlfriend in Colorado, who reciprocated from her Jeep. Neither one of us almost died. For months. But then, one night on a snowy two-lane highway, while I was crossing Wyoming to see my girl’s real face, my phone made its chirpy you-have-a-picture noise, and I glanced down in its direction while also, apparently, swerving off the pavement and sailing over a steep embankment toward a barbed-wire fence.
It was interesting to me—in retrospect, after having done some reading about the frenzied activity of the multitasking brain—how late in the process my prefrontal cortex, where our cognitive switchboards hide, changed its focus from the silly phone (Where did it go? Did it slip between the seats? I wonder if this new photo is a nude shot or if it’s another one from the topless series that seemed like such a breakthrough a month ago but now I’m getting sick of) to the important matter of a steel fence post sliding spear-like across my hood. The laminated windshield glass must have been high quality; the point of the post bounced off it, leaving only a star-shaped surface crack. But I was still barreling toward sagebrush, and who knew what rocks and boulders lay in wait.
Five minutes later, I’d driven out of the field and gunned it back up the embankment onto the highway and was proceeding south, heart slowing some, satellite radio tuned to a soft-rock channel called the Heart, which was playing lots of soothing Céline Dion.
“I just had an accident trying to see your picture.”
“Will you get here in time to take me out to dinner?”
“I almost died.”
“Well, you sound fine.”
“Fine’s not a sound.”
I never forgave her for that detachment. I never forgave myself for buying a camera phone. We all remember the promises. The slogans. They were all about freedom, liberation. Supposedly we were in handcuffs and wanted out of them. The key that dangled in front of us was a microchip. “Where do you want to go today?” asked Microsoft in a mid-1990s ad campaign. The suggestion was that there were endless destinations—some geographic, some social, some intellectual—that you could reach in milliseconds by loading the right devices with the right software. It was further insinuated that where you went was purely up to you, not your spouse, your boss, your kids, or your government.
Autonomy through automation. This was the embryonic fallacy that grew up into the monster of multitasking. Human freedom, as classically defined (to think and act and choose with minimal interference by outside powers), was not a product that firms like Microsoft could offer, but they recast it as something they could provide. A product for which they could raise the demand by refining its features, upping its speed, restyling its appearance, and linking it up with all the other products that promised freedom, too, but had replaced it with three inferior substitutes that they could market in its name: Efficiency, convenience, and mobility.
For proof that these bundled minor virtues don’t amount to freedom but are, instead, a formula for a period of mounting frenzy climaxing with a lapse into fatigue, consider that “Where do you want to go today?” was really manipulative advice, not an open question. “Go somewhere now,” it strongly recommended, then go somewhere else tomorrow, but always go, go, go—and with our help. But did any rebel reply, “Nowhere. I like it fine right here”? Did anyone boldly ask, “What business is it of yours?” Was anyone brave enough to say, “Frankly, I want to go back to bed”?
Maybe a few of us. Not enough of us. Everyone else was going places, it seemed, and either we started going places, too— especially to those places that weren’t places (another word they’d redefined) but were just pictures or documents or videos or boxes on screens where strangers conversed by typing—or else we’d be nowhere (a location once known as “here”) doing nothing (an activity formerly labeled “living”). What a waste this would be. What a waste of our new freedom. Our freedom to stay busy at all hours, at the task—and then the many tasks, and ultimately the multitask—of trying to be free. It isn’t working, it never has worked.
The scientists know this too, and they think they know why. Through a variety of experiments, many using functional magnetic resonance imaging to measure brain activity, they’ve torn the mask off multitasking and revealed its true face, which is blank and pale and drawn. Multitasking messes with the brain in several ways. At the most basic level, the mental balancing acts that it requires—the constant switching and pivoting—energize regions of the brain that specialize in visual processing and physical coordination and simultaneously appear to shortchange some of the higher areas related to memory and learning. We concentrate on the act of concentration at the expense of whatever it is that we’re supposed to be concentrating on.
What does this mean in practice? Consider a recent experiment at UCLA, where researchers asked a group of 20-somethings to sort index cards in two trials, once in silence and once while simultaneously listening for specific tones in a series of randomly presented sounds. The subjects’ brains coped with the additional task by shifting responsibility from the hippocampus—which stores and recalls information—to the striatum, which takes care of rote, repetitive activities. Thanks to this switch, the subjects managed to sort the cards just as well with the musical distraction—but they had a much harder time remembering what, exactly, they’d been sorting once the experiment was over.
Even worse, certain studies find that multitasking boosts the level of stress-related hormones such as cortisol and adrenaline and wears down our systems through biochemical friction, prematurely aging us. In the short term, the confusion, fatigue, and chaos merely hamper our ability to focus and analyze, but in the long term, they may cause it to atrophy.
The next generation, presumably, is the hardest-hit. They’re the ones way out there on the cutting edge of the multitasking revolution, texting and instant messaging each other while they download music to their iPod and update their Facebook page and complete a homework assignment and keep an eye on the episode of The Hills flickering on a nearby television. (A recent study from the Kaiser Family Foundation found that 53 percent of students in grades seven through 12 report consuming some other form of media while watching television; 58 percent multitask while reading; 62 percent while using the computer; and 63 percent while listening to music. “I get bored if it’s not all going at once,” said a 17-year-old quoted in the study.)
They’re the ones whose still-maturing brains are being shaped to process information rather than understand or even remember it. This is the great irony of multitasking— that its overall goal, getting more done in less time, turns out to be chimerical. In reality, multitasking slows our thinking. It forces us to chop competing tasks into pieces, set them in different piles, then hunt for the pile we’re interested in, pick up its pieces, review the rules for putting the pieces back together, and then attempt to do so, often quite awkwardly. (Fact: A brain attempting to perform two tasks simultaneously will, because of all the back-and-forth stress, exhibit a substantial lag in information processing.)
Productive? Efficient? More like running up and down a beach repairing a row of sand castles as the tide comes rolling in and the rain comes pouring down. Multitasking, a definition: “The attempt by human beings to operate like computers, often done with the assistance of computers.” It begins by giving us more tasks to do, making each task harder to do, and dimming the mental powers required to do them. It finishes by making us forget exactly how on earth we did them (assuming we didn’t give up, or “multiquit”), which makes them harder to do again.
After the near-fatal consequences of my 2003 decision to buy a phone with a feature I didn’t need, life went on—and rather rapidly, since multitasking eats up time in the name of saving time, rushing you through your two-year contract cycle and returning you to the company store with a suspicion that you didn’t accomplish all you hoped to after your last optimistic, euphoric visit.
“Which of the ones that offer rebates don’t have cameras in them?”
“The decent models all do. The best ones now have video capabilities. You can shoot little movies.”
I wanted to ask, Of what? Oncoming barbed wire? I shook my head. I was turning down whiz-bang features for the first time. “I’ll take the fat little free one,” I told the salesman.
“The thing’s inert. It does nothing. It’s a pet rock.”
I informed him that I was old enough to have actually owned a pet rock once and that I missed it.
From a longer essay that appears in November’s The Atlantic Monthly. © 2007 by The Atlantic Monthly Group. Distributed by Tribune Media Services.
Thursday, October 25, 2007
Willow Creek looks at what hasn't worked
The following post is from the blog Out of Ur and can be found here.
October 18, 2007
Willow Creek Repents?
Why the most influential church in America now says "We made a mistake."
Few would disagree that Willow Creek Community Church has been one of the most influential churches in America over the last thirty years. Willow, through its association, has promoted a vision of church that is big, programmatic, and comprehensive. This vision has been heavily influenced by the methods of secular business. James Twitchell, in his new book Shopping for God, reports that outside Bill Hybels’ office hangs a poster that says: “What is our business? Who is our customer? What does the customer consider value?” Directly or indirectly, this philosophy of ministry—church should be a big box with programs for people at every level of spiritual maturity to consume and engage—has impacted every evangelical church in the country.
So what happens when leaders of Willow Creek stand up and say, “We made a mistake”?
Not long ago Willow released its findings from a multiple year qualitative study of its ministry. Basically, they wanted to know what programs and activities of the church were actually helping people mature spiritually and which were not. The results were published in a book, Reveal: Where Are You?, co-authored by Greg Hawkins, executive pastor of Willow Creek. Hybels called the findings “earth shaking,” “ground breaking,” and “mind blowing.”
If you’d like to get a synopsis of the research you can watch a video with Greg Hawkins here. And Bill Hybels’ reactions, recorded at last summer’s Leadership Summit, can be seen here. Both videos are worth watching in their entirety, but below are a few highlights.
In the Hawkins’ video he says, “Participation is a big deal. We believe the more people participating in these sets of activities, with higher levels of frequency, it will produce disciples of Christ.” This has been Willow’s philosophy of ministry in a nutshell. The church creates programs/activities. People participate in these activities. The outcome is spiritual maturity. In a moment of stinging honesty Hawkins says, “I know it might sound crazy but that’s how we do it in churches. We measure levels of participation.”
Having put all of their eggs into the program-driven church basket, you can understand their shock when the research revealed that “Increasing levels of participation in these sets of activities does NOT predict whether someone’s becoming more of a disciple of Christ. It does NOT predict whether they love God more or they love people more.”
Speaking at the Leadership Summit, Hybels summarized the findings this way:
Some of the stuff that we have put millions of dollars into thinking it would really help our people grow and develop spiritually, when the data actually came back it wasn’t helping people that much. Other things that we didn’t put that much money into and didn’t put much staff against is stuff our people are crying out for.
Having spent thirty years creating and promoting a multi-million dollar organization driven by programs and measuring participation, and convincing other church leaders to do the same, you can see why Hybels called this research “the wake up call” of his adult life.
Hybels confesses:
We made a mistake. What we should have done when people crossed the line of faith and become Christians, we should have started telling people and teaching people that they have to take responsibility to become ‘self feeders.’ We should have gotten people, taught people, how to read their bible between service, how to do the spiritual practices much more aggressively on their own.
In other words, spiritual growth doesn’t happen best by becoming dependent on elaborate church programs but through the age old spiritual practices of prayer, bible reading, and relationships. And, ironically, these basic disciplines do not require multi-million dollar facilities and hundreds of staff to manage.
Does this mark the end of Willow’s thirty years of influence over the American church? Not according to Hawkins:
Our dream is that we fundamentally change the way we do church. That we take out a clean sheet of paper and we rethink all of our old assumptions. Replace it with new insights. Insights that are informed by research and rooted in Scripture. Our dream is really to discover what God is doing and how he’s asking us to transform this planet.
Posted by UrL on October 18, 2007
October 18, 2007
Willow Creek Repents?
Why the most influential church in America now says "We made a mistake."
Few would disagree that Willow Creek Community Church has been one of the most influential churches in America over the last thirty years. Willow, through its association, has promoted a vision of church that is big, programmatic, and comprehensive. This vision has been heavily influenced by the methods of secular business. James Twitchell, in his new book Shopping for God, reports that outside Bill Hybels’ office hangs a poster that says: “What is our business? Who is our customer? What does the customer consider value?” Directly or indirectly, this philosophy of ministry—church should be a big box with programs for people at every level of spiritual maturity to consume and engage—has impacted every evangelical church in the country.
So what happens when leaders of Willow Creek stand up and say, “We made a mistake”?
Not long ago Willow released its findings from a multiple year qualitative study of its ministry. Basically, they wanted to know what programs and activities of the church were actually helping people mature spiritually and which were not. The results were published in a book, Reveal: Where Are You?, co-authored by Greg Hawkins, executive pastor of Willow Creek. Hybels called the findings “earth shaking,” “ground breaking,” and “mind blowing.”
If you’d like a synopsis of the research, you can watch a video with Greg Hawkins here. Bill Hybels’ reactions, recorded at last summer’s Leadership Summit, can be seen here. Both videos are worth watching in their entirety, but below are a few highlights.
In the Hawkins’ video he says, “Participation is a big deal. We believe the more people participating in these sets of activities, with higher levels of frequency, it will produce disciples of Christ.” This has been Willow’s philosophy of ministry in a nutshell. The church creates programs/activities. People participate in these activities. The outcome is spiritual maturity. In a moment of stinging honesty Hawkins says, “I know it might sound crazy but that’s how we do it in churches. We measure levels of participation.”
Having put all of their eggs into the program-driven church basket you can understand their shock when the research revealed that “Increasing levels of participation in these sets of activities does NOT predict whether someone’s becoming more of a disciple of Christ. It does NOT predict whether they love God more or they love people more.”
Speaking at the Leadership Summit, Hybels summarized the findings this way:
Some of the stuff that we have put millions of dollars into thinking it would really help our people grow and develop spiritually, when the data actually came back it wasn’t helping people that much. Other things that we didn’t put that much money into and didn’t put much staff against is stuff our people are crying out for.
Having spent thirty years creating and promoting a multi-million dollar organization driven by programs and measuring participation, and convincing other church leaders to do the same, you can see why Hybels called this research “the wake up call” of his adult life.
Hybels confesses:
We made a mistake. What we should have done when people crossed the line of faith and become Christians, we should have started telling people and teaching people that they have to take responsibility to become ‘self feeders.’ We should have gotten people, taught people, how to read their bible between service, how to do the spiritual practices much more aggressively on their own.
In other words, spiritual growth doesn’t happen best by becoming dependent on elaborate church programs but through the age-old spiritual practices of prayer, Bible reading, and relationships. And, ironically, these basic disciplines do not require multi-million-dollar facilities and hundreds of staff to manage.
Does this mark the end of Willow’s thirty years of influence over the American church? Not according to Hawkins:
Our dream is that we fundamentally change the way we do church. That we take out a clean sheet of paper and we rethink all of our old assumptions. Replace it with new insights. Insights that are informed by research and rooted in Scripture. Our dream is really to discover what God is doing and how he’s asking us to transform this planet.
Posted by UrL on October 18, 2007
Tuesday, October 16, 2007
I'm entitled...
I read an article recently (I'll apologize up front that I don't remember which magazine it was in, and I'm not going to track it down...if I do, I won't ever finish this thought), and I've been thinking about it since. I'm not thinking about the subject of the article, which was the rise of schools limiting the number of college recommendations they will write for a student. No, what struck me was the response from the parents quoted in the article. Never mind that some students were applying to dozens of schools, or that the advisers and teachers were spending hours outside of work to complete these forms for the students. That didn't matter to the parents of little Johnny or Sue. (Or more likely, little Jared and Savannah.) They bristled at the idea that the schools would impose limits on their child's "choices and dreams." The schools quoted weren't stripping the procedure down to the bare bones; they were talking about limiting the applications to 14 in one case and, I believe, 20 in another. That doesn't sound like the big, bad principal stomping all over Junior's life plans. Studies have shown that the shotgun approach to admissions isn't very helpful anyway. (U.S. News and World Report had articles on this earlier this year.) It's much better to streamline your applications to a half dozen schools, a few of those being safe schools: schools you are sure to get into.
Anyway, back to the parents' (and probably students') attitude. There is a prevailing attitude in our culture that "choice" is God. We can't do anything to limit anyone's choices in life; that seems to be the primary sin of our society. Don't tie me down, don't fence me in, and whatever you do, don't limit my choices. Never mind that more choices don't make us happier. Never mind that more choices actually cause greater stress and lower contentment with our eventual decision. Never mind that greater choice for me may mean major inconvenience, or worse, for someone else. As long as you don't limit me, I'm supposedly happy. (Read The Paradox of Choice for more on how more choices don't make things better.)
I was so disappointed to see those quotes from parents wanting people to work overtime so their child can apply to dozens of schools. What happened to teaching our children to be considerate of others? What happened to self-control, to putting others before ourselves? And what happened to growing up? The reality is, it's not all about YOU, child. These parents are fostering an entitlement mentality in a major way.
This mentality pervades relationships, work situations, marriages, faith communities and more. Don't tell me what to do. Don't limit my choices. Don't make me decide. And if I do decide, then I want all decisions to be reversible. Don't even hint that there is anything permanent here. Permanence limits my choices.
I'm entitled.
Thursday, October 11, 2007
Just not in the mood...
My life hasn't gotten any busier in the last week, but I haven't been spending my free time on the computer. I seem to go in seasons when it comes to my leisure activities. I'll do nothing but read mysteries for weeks on end, then not pick one up for several months. Or I'll be so active on the forums I participate in that it takes every free moment for weeks. Lately, I've been reading blogs for hours a day and trying to keep up with forums and email as well. But, like a light bulb, the desire to do anything on the computer just shut off last Friday and hasn't really come back on. I didn't even realize until Sunday afternoon that I hadn't turned my computer on since Friday evening. I didn't choose to stay off the computer; I didn't even think about it. That's unusual.
I think I'll just go with the feeling. I'm sure in a few days I'll be ready to jump back into the fray, reading and thinking about "the issues" again. But right now, I'm going to read my book, watch TV, read to my daughter, and generally enjoy (finally!!) the fall-like weather we're having!
Monday, October 8, 2007
What is a Charity?
Here is an article by Robert Reich of the Los Angeles Times about charitable contributions. I was skeptical at first when I just read excerpts in The Week magazine (great weekly mag, btw), but I thought the original piece worth reading. It's not overly long. What do you think?
Is Harvard a charity?
Most donations go to institutions that serve the rich; they shouldn't be fully tax-deductible.
By Robert B. Reich
October 1, 2007
This year's charitable donations are expected to total more than $200 billion, a record. But a big portion of this impressive sum -- especially from the wealthy, who have the most to donate -- is going to culture palaces: to the operas, art museums, symphonies and theaters where the wealthy spend much of their leisure time. It's also being donated to the universities they attended and expect their children to attend, perhaps with the added inducement of knowing that these schools often practice a kind of affirmative action for "legacies."
I'm all in favor of supporting the arts and our universities, but let's face it: These aren't really charitable contributions. They're often investments in the lifestyles the wealthy already enjoy and want their children to have too. They're also investments in prestige -- especially if they result in the family name being engraved on the new wing of an art museum or symphony hall.
It's their business how they donate their money, of course. But not entirely. Charitable donations to just about any not-for-profit are deductible from income taxes. This year, for instance, the U.S. Treasury will be receiving about $40 billion less than it would if the tax code didn't allow for charitable deductions. (That's about the same amount the government now spends on Temporary Assistance for Needy Families, which is what remains of welfare.) Like all tax deductions, this gap has to be filled by other tax revenues or by spending cuts, or else it just adds to the deficit.
I see why a contribution to, say, the Salvation Army should be eligible for a charitable deduction. It helps the poor. But why, exactly, should a contribution to the already extraordinarily wealthy Guggenheim Museum or to Harvard University (which already has an endowment of more than $30 billion)?
A while ago, New York's Lincoln Center had a gala supported by the charitable contributions of hedge-fund industry leaders, some of whom take home $1 billion a year. I may be missing something, but this doesn't strike me as charity. Poor New Yorkers rarely attend concerts at Lincoln Center.
It turns out that only an estimated 10% of all charitable deductions are directed at the poor. So here's a modest proposal. At a time when the number of needy continues to rise, when government doesn't have the money to do what's necessary for them and when America's very rich are richer than ever, we should revise the tax code: Focus the charitable deduction on real charities.
If the donation goes to an institution or agency set up to help the poor, the donor gets a full deduction. If the donation goes somewhere else -- to an art palace, a university, a symphony or any other nonprofit -- the donor gets to deduct only half of the contribution.
Robert B. Reich, author of "Supercapitalism: The Transformation of Business, Democracy, and Everyday Life," was secretary of Labor under President Clinton.
Wednesday, October 3, 2007
A little insight
My 11th grade son came home with his mid-marking period progress report last Friday. This is his first year in public school after a life of homeschooling, so Will and I were very excited to see how Thomas was doing. His teachers all marked "doing well," and there were only two comments. His English teacher wishes he'd participate more. If she means she wishes he would talk in class, good luck. His Algebra 2 teacher wrote his average: 89%. Honestly, we're thrilled. We didn't know what to expect in math since he's hated it so much the last few years. I happened to mention that it would drive me crazy to be so close to an A and not get it. His response? "I'm glad I'm not you." I had to laugh. He really, honestly, truly doesn't care if he gets 89% rather than 90%, and he won't try any harder just to go for the A. Don't get me wrong, he's not goofing off, and I'm thankful. It's just that he's not exactly busting his tail, either. It's not just that grades don't mean anything to him, it's that he only wants to do what he needs to to get by. It's a mindset.
I can't help but wonder where this mindset came from. Is it nature or nurture? Or both? Did homeschooling contribute to it along with a natural tendency towards complacency? Would putting him in school have sparked a competitive streak?
I was reflecting on those questions this evening and it caused me to think back over my years in school. And I realized something. I was fiercely competitive in many ways, but never really in school. In public school I was often competitive with specific people. In other words, I didn't so much care about my grades as I cared about beating one or two specific people. In college, when the competition seemed more anonymous, my grades actually fell. I only received high marks in "cake" courses (usually courses where my natural gregariousness was an asset, like philosophy) or in my major classes. I got good grades in my major classes because I was engrossed in the material. I loved it so I learned it. But English, chemistry, and math? B's and C's were fine with me.
Thomas doesn't appear to be competitive about much outside of video games. But within the video games, he is very competitive and he takes it very seriously. Someday, perhaps, Thomas will find something else in life he is as interested in as video games. Or else he'll find a way to translate his video gaming into a life.
So now I'm not so worried about his complacency over a B in Algebra 2. He cares enough to do his homework promptly and go to class prepared. That's a huge improvement over the past few years. I have a feeling that this will be a trend.