I have often thought about what it might have been like to live through pivotal years when scientific thinking or practice was undergoing a period of intense change. To have been a scientist in 1859 when Darwin’s Origin of Species was first published, say, and to find your frame of reference turned completely upside-down, picked apart and stitched back together in some velvet-lined Victorian salon. Or to have been involved in the birth of molecular biology in the 1940s and 50s, seeing the light in a Petri plate full of phage plaques and telling the world about your findings on a dusty chalkboard at Cold Spring Harbor or the Institut Pasteur.
Those were the days…
Although I must always have known it intuitively, it only dawned on me a few days ago that modern science is always in the midst of such change. It just occurs so gradually that we realize it only in retrospect. Looking back on my career now, I recall a few moments of fumbling in the dark, working on things before the truth was revealed. As an undergraduate back in the 1980s, for example, I did a summer stint at the National Institutes of Health, trying to understand why certain strains of human papillomavirus could transform cervical epithelial cells. We knew it was down to the viral proteins E6 and E7, but nobody had a clue why. And I have a distinct memory of feeling almost overwhelmed by a black universe of ignorance – it didn’t matter how many plates of cells I forced through the FACS machine or peered at down the microscope: we would never really know what was going on. We had no purchase, no frame of reference: HPV’s transformational properties were practically magical – it might as well have been raw meat spontaneously generating maggots.
And then, of course, people worked out how the viral proteins bound to the cellular tumor suppressors p53 and Rb a few years later, and it all turned into humdrum textbook material. And now, when we discover new transforming agents, we have an arsenal of reagents to fall back upon – ignorance isn’t a universe, just a temporary and easily remedied setback. Thinking objectively, I honestly don’t think it was my immaturity that made me feel so lost at sea; I genuinely believe that when it comes to molecular cell biology, we now know enough to be within spitting distance, at any one time, of all known pathways or effectors. We may not have the complete Google-Earth view, but we certainly have a low-resolution roadmap of how the cell works. And it just wasn’t like that in the 1980s. Hey presto – I have lived the before-and-after of the recombinant DNA era. Post-genomics, ditto. Two hundred years hence, some scientist may think nostalgically about the years that passed me by without their significance even registering.
It’s not just knowledge that accrues unawares. Technology, too, ticks steadily onward while you aren’t looking. On Friday, I watched a colleague prepare to do a Western blot. But instead of assembling an SDS-PAGE gel from scratch, she reached for a ready-made version sealed in foil, shelf life approximately six months. This stuff was available when I last did research four years ago, but it hadn’t yet gone mainstream, and we certainly didn’t have any in our lab. These days, nobody bothers making gels themselves, and all I could think was, Thank god. Yes, I’m afraid that pouring acrylamide gels is not something I can say I missed during my editorial sabbatical.
Maybe the biggest change I have lived through, technologically speaking, was the transformation of DNA sequencing from do-it-yourself to outsourced. When I was a Ph.D. student in Seattle in the early 1990s, I wanted to understand how feline leukemia virus envelope genes mutated. To do this, I had to sequence the envelope gene (all 2,100 base pairs of it) of viruses I cloned from various infected cats at different time points, over and over and over again. For four years. Effectively, this meant that every day, I was pouring a big, thin acrylamide gel (remember how fiddly that was?), doing radioactive sequencing reactions via the dideoxy chain-termination method, electrophoresing the previous day’s reactions, developing the film from two days before and (worst of all) manually entering all the sequences into the computer for alignment. The G, C, A and T keys of my Mac SE were visibly more weathered than all the others, and within a few months I found I knew entire stretches of the wild-type FeLV envelope gene by heart.
At one point, my Ph.D. supervisor noted my passage through a momentous milestone: I, like her, was a member of the One Megabase Club. Did I pause to celebrate? No, I was too busy silanizing my plates and de-gassing my gel solution.
Of course, all this should inspire us to try to guess which routine technique of today might be passé in ten years’ time, or which stubborn barriers in our knowledge will be knocked through. Stuck in the present tense, though, we can only wonder.
I’m still waiting for the special hat I can put on people’s heads which outputs their neuropsychological test scores on a range of measures. One day…
I think I saw that in an episode of the old Star Trek…
Jenny, what an interesting post. It reminds me of my PhD in which I had to subject my data to a lot of multivariate statistics. Just feed them into SPSS-X, press a button, and whoosh! — lots of lovely printouts in which everything was correlated with everything else.
I had fallen into the trap of doing all sorts of calculations simply because I could, rather than because they were necessary.
My supervisor recalled that when he was a lad, it took days to perform a simple analysis of variance… with a mechanical adding machine.
Such constraints force you to choose your problems carefully, and to devote plenty of time to that most fundamental question – what, precisely, is the null hypothesis you’re trying to test?
Henry, interesting. So you think that all the shortcuts make us intellectually lazy? This certainly sounds plausible, especially in our current ‘kit culture’. If you just add your sample to a kit and follow the instructions, you might not really know what reaction you’re doing (so, no need to know the theory behind the kit to make it work), or be able to adjust the parameters if the yield isn’t optimal.
But I am grateful that sequencing is automated now. There is no intellectual rigor in typing in 1 megabase’s worth of sequence by hand.
I think that’s rather an over-generalization. In an ideal world, labor-saving devices should help free your mind to concentrate on Higher Things.
Mmmm. Maybe. I still think it’s good to know exactly why you can put cells into a tube, add some magic ingredients, run the lysate through a filter and then come out with purified RNA on the other end. Otherwise we are training up a generation of scientifically illiterate people.
nods
I bet most of our students don’t even know how minipreps work (unless they’re my students, of course. Mwah hah hah). This makes it ‘interesting’ when the kit doesn’t work as intended.
Precisely. You can’t troubleshoot when you don’t know what you’ve done.
Personally, I sort of miss phenol.
Heh. I am the lab’s Phenol Keeper. Ph34r me.
I just wanted to make sure nobody got the impression I’m anti-tech. I think it’s fantastic that we can do things in a day that used to take a week, and things in a week that used to take half a year. The only downsides I can see are the above-mentioned not knowing what an advance is actually a shortcut for, but also… I sometimes get the impression that because experiments are easier to do now, we might be more likely to dash off on impulse rather than weighing whether the experiment is necessary, or whether another tack might work better.
Well yes. I mean, I never want to pour a sequencing gel ever again, but there’s something a little… worrying about a generation of scientists growing up for whom sequencing is something magical that happens in a building across town.
(And before you ask, I am the sort of person who isn’t happy letting, say, a mechanic loose on my car without knowing what’s going on. I’ll let her do it, but I want to know how it works.)
On the other hand, I use my computer and haven’t a clue how it works. And life is going on just fine. Perhaps, Richard, we are just getting old and curmudgeonly? 🙂
Speak for yourself, Jen 🙂
You’re right there, though. Macs are magic. Fantastic.
I do wonder about your comment “we might be more likely to dash off on impulse rather than weighing whether the experiment is necessary”. I’m trying to think whether I’ve ever done, or felt, that. I guess so, actually: my previous boss was always telling us that ‘oligos are cheap’, so we should get them made even if we ended up not doing the experiment because it turned out to be unnecessary. For him, saving a few days was more important than saving a few quid. And yeah, I understand that.
That’s exactly what I mean. But I’ve seen entire experiments dashed off and then the results never analyzed because halfway through the person realized it wasn’t that interesting after all. A little thought beforehand would have prevented this waste of time and effort.