In 1955, while addressing the National Academy of Sciences, Richard Feynman stated, "Scientific knowledge is a body of statements of varying degrees of certainty." As usual, Feynman's statement was spot on, and it holds true decades later. In his famous "Plenty of Room at the Bottom" lecture, Feynman talked about what we now call nanotechnology and its many potential applications. Here I am, half a century later, working "at the bottom" and living in a world of uncertainty. I hope to share some of the exciting discoveries at the nanoscale and explain how they apply to my passion, biotechnology, as well as the everyday world.
My posts are presented as opinion and commentary and do not represent the views of LabSpaces Productions, LLC, my employer, or my educational institution.
Standard deviation. Error bars. Significance. Confidence interval. No matter what you call it, or how you calculate it, science is about more than numerical results. It’s about context. What do those numbers MEAN? (Statistics pun intended.)
I may have mentioned this before, but my college holds a weekly seminar where all students attend and take turns presenting their research. My college is very interdisciplinary: we have electrical engineers, mechanical engineers, physicists, materials scientists, chemists, chemical engineers, environmental scientists, epidemiologists, industrial hygienists, biologists, biological engineers, mathematicians, computer scientists, etc. I think the only hard sciences we're missing are geology and astronomy! So, as you might imagine, these different fields have vastly different traditions.
As physics undergraduates, for example, we would approximate a horse as a sphere. That’s a silly example, sure, but there are many times when there are misunderstandings between fields. “Annealing” to a molecular biologist is the hybridization of two strands of DNA; to a materials scientist, it means heating your sample to (usually) hundreds of °C to rearrange the crystal structure. CV to an electrochemist is cyclic voltammetry, but to an electrical engineer it is capacitance-voltage. CVD can be cardiovascular disease or chemical vapor deposition. These can easily be explained away, but the different ways these fields deal with error are much more difficult.
When I took advanced physics lab as an undergraduate, we learned about error propagation. There is an inherent error in your measurement (how accurately can you measure a value?), and there is also a variation (how consistent is that measured value?). So maybe you can read a meter stick to +/- 1 mm; that is one error. But maybe every time you make a meter stick it is off by +/- 2 mm. Both are errors you need to take into account as you calculate from these measurements and report your results. Makes sense to me.
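If the two error sources are independent, the textbook way to combine them is in quadrature (root-sum-of-squares). Here's a minimal sketch in Python using the meter-stick numbers above; the function name is my own, not from any particular library:

```python
import math

# Hedged sketch: combine independent error sources in quadrature.
# The +/- 1 mm reading error and +/- 2 mm stick-to-stick variation
# are the figures from the meter-stick example.
def combine_in_quadrature(*errors_mm):
    """Total uncertainty from independent error sources, in mm."""
    return math.sqrt(sum(e ** 2 for e in errors_mm))

reading_error = 1.0        # mm: how accurately you can read the stick
manufacturing_error = 2.0  # mm: variation from stick to stick

total = combine_in_quadrature(reading_error, manufacturing_error)
print(f"total uncertainty: +/- {total:.2f} mm")  # sqrt(1 + 4) ≈ 2.24 mm
```

The point is that the combined uncertainty (about 2.24 mm here) is dominated by the larger source but never smaller than either one.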
But then I moved into this college with micro/molecular biologists, and electrical engineers. My research is at the intersection of the two and they both handle their experimentation very differently.
Let’s start with a biology example. Say you want to find out if cells will move towards a chemo-attractant source. You would plate some cells, introduce the source, and track how the cells move. You would track many cells and come up with an average. You would compare this average directional movement to a plate of cells with no source, and hopefully the directionality would be significantly different (based on a statistical analysis). You would also likely repeat this five times to make sure it isn’t a fluke. You can then say with some confidence whether the cells prefer to move towards the source. To an electrical engineer, though, you’re watching blobs move around on the screen!
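The statistical analysis at the end of that procedure is often a two-sample t-test on the per-plate averages. A rough sketch in Python, with invented directionality scores (the numbers, the scoring scheme, and the ~2.8 critical value are illustrative, not from any real experiment):

```python
import statistics

# Hypothetical per-plate directionality scores (e.g. average cosine of
# each cell's heading toward the source), five replicates per condition.
# All values are made up for illustration.
with_source = [0.42, 0.38, 0.51, 0.45, 0.40]
no_source = [0.02, -0.05, 0.08, 0.01, -0.03]

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    mean_a, mean_b = statistics.mean(a), statistics.mean(b)
    var_a, var_b = statistics.variance(a), statistics.variance(b)
    return (mean_a - mean_b) / ((var_a / len(a) + var_b / len(b)) ** 0.5)

t = welch_t(with_source, no_source)
# A |t| well above ~2.8 (a rough 5% critical value at these sample
# sizes) suggests the directional bias is not a fluke.
print(f"t = {t:.1f}")
```

With real data you would also want a proper p-value (e.g. from `scipy.stats.ttest_ind` with `equal_var=False`), but the shape of the argument — replicates, means, variance, significance — is the same.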
An EE (at least in our college) is usually focused on making devices. For me, I’m in a biology lab trying to make solid state DNA sensors. I get frustrated when I can make a sensor that works, but then the next one doesn’t. For a biologist that’s not acceptable. There should be consistency (like the 5 replicates in the above example). However, an electrical engineer friend of mine quipped: “in our field, if it works once, you publish it!”
And that was really the root of the discussion last week. Biologists (and a lot of scientists) do NOT like that concept. They want some amount of error. The EEs said they don’t have any, and that’s where I disagreed. While they don’t have the reproducibility of the biology experiment above, they do know their measurement error. They can measure resistance values, for example, to pico or femto accuracy. That doesn’t faze them and goes without saying. My argument is that this DOES matter: it means something to a biologist. Of course, biologists will always prefer to see a lot of replicates, but if you can explain that for that ONE experiment you are extremely confident about what you observed and measured, that means something.
Going back to my advanced physics lab, for a biologist it is hard to take into account measurement error, so we can counterbalance that with a ton of replicates. For an electrical engineer it is hard to get many replicates, but the measurement error is so small we can get significance that way. I don’t think either field is right or wrong, it is just what they’re used to and we need to be able to explain and justify those differences.
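The trade-off can be made concrete: the standard error of a mean shrinks as 1/sqrt(n), so many noisy replicates and a single very precise measurement can end up with comparable overall uncertainty. A toy sketch (all numbers invented, units arbitrary):

```python
import math

# Illustrative only: a noisy assay averaged over many replicates versus
# one measurement on a very precise instrument.
def standard_error(spread, n):
    """Standard error of the mean for n replicates with the given spread."""
    return spread / math.sqrt(n)

biology_spread, biology_n = 10.0, 25  # noisy measurements, many replicates
instrument_error, device_n = 2.0, 1   # precise meter, a single device

print(f"biology:     +/- {standard_error(biology_spread, biology_n):.1f}")  # 10/5 = 2.0
print(f"engineering: +/- {standard_error(instrument_error, device_n):.1f}")  # 2.0
```

Neither route is free: the biologist pays in replicates, the engineer pays in instrument precision, but both can end up equally confident in the number they report.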
What do you think? How is error treated in your field? Love it or hate it?
It seems weird that an EE wouldn't care about being able to replicate the ability of a sensor to perform its function. At some point, you're going to want to use the technology to produce a piece of equipment that performs reliably.
I agree, it is strange. It really depends on where the person comes from. Someone from the sensor field would be concerned about that, but someone from an EE background that just wants to show that they came up with a new sensing method only needs it to work once. That's probably part of the reason you see papers on "novel" sensing technologies all the time, but never anything commercialized.
Since you're working on semiconductor sequencing, what do you think of Oxford Nanopore's MinION? Do you think it's vaporware? They presented some interesting "data" last February but haven't released anything since. The minIONs were supposed to start shipping the second half of 2012.
The difference is not in the fields of study, but rather in the two different types of work being done. In the example, the EE is making a new device -- i.e. developing a new type of technology. Meanwhile, the biologist is trying to uncover a law of nature. The methodologies used are quite different, but that has less to do with the field than with the difference between "creation" and "discovery". If the biologist were developing a technique to clone mammals, then it would only take one success to publish (e.g. Dolly the sheep). If the electrical engineer were researching a new electrical theory (think back to when Ohm developed Ohm's law), then he would use a more statistical approach.
Sorry I didn't see this sooner. I hear some buzz about Oxford every once in a while, but they still seem to be in R&D to me. I'll admit I haven't looked at it closely, but I see a lot more success from Life Tech's IonTorrent platform. Oxford also seems more complicated than what IBM is developing, but until people get their hands on the tool it is all speculation.
That's an interesting way to look at it, and I see what you're saying. I think that's a big part of it, but I don't think that's the whole picture. I'd still be skeptical of Dolly until it has been done multiple times, and I'm not sure EEs always use statistics before proposing their theories. But, I do think you have a really good point. And maybe it goes back to scientist versus engineer.