My posts are presented as opinion and commentary and do not represent the views of LabSpaces Productions, LLC, my employer, or my educational institution.
I am a scientist for profit. This means, as you are well aware, I have to work with marketing people to generate pretty pictures showing perfect results with any product that we sell. You know those flyers, brochures, and ads in BioTechniques where a tiny picture of a gel, or a qPCR assay with Photoshop-perfect curves or bands, is plopped on the page next to some meaningless picture and is supposed to convince you to call or go to a website? Those things.
Before working for a company, I would take a look at those pictures, but I never put much stock in them. I mean, of course they're going to show perfect data. What else would they show? That their kit sucks next to a competitor's? So marketing data never really swayed me much. I looked at it, but not in any depth. I guess I expect there to be some attempt at science in the ad, but it's merely representative data.
My first biotech job wasn't in marketing. The company I worked for was and still is considered one of the best in the world and I was so very proud to be a part of that company. When they would introduce a new product, the product manager would come present all the beautiful R&D data proving the product works and it was convincing. I would walk away from those meetings absolutely positive that this was the best damn invention in the world and we have geniuses in R&D and how lucky am I to represent such brilliance.
About this first company, I still do believe that they have geniuses in R&D. However, since leaving, I feel that their employees are extremely self-obsessed and self-absorbed but I can understand why they are that way. It is part of the company culture. But that isn't the point of this article.
My experience in marketing as a product manager gave me a new outlook on how data is presented to the consumer. When your job hinges on whether or not a product makes money, and your raise is measured against how much, suddenly those Photoshop-perfect images look pretty relevant plopped in the middle of that unimaginative, unimpressive, meaningless, boring ad.
In my current position, I decide what images are used and oversee the scientists performing this work, and performance ratings are completely unlinked to revenue. This means that what you see is what you get. And because we get to know our customers on a personal level, we don't want to risk putting out any piece of data that could be called into question. It has to be at a standard that meets the level of the customer. And if I am going to use outside data from a customer, it has to be equally good. If it is not, then I am careful about how the results are interpreted so that I don't mislead customers (or sometimes I won't use it at all).
Recently a company that I am fond of tweeted a link to a "Tech Note" that they published on their website. I went to the article as it was a topic of personal interest. The first thing I noticed was that the article was data from a customer, not data generated internally. But it doesn't excuse the fact that the data was garbage. My first thought was, did anyone in marketing stop to think that maybe they should run these results by a scientist in R&D? Was not a single person with a scientific background involved in approving the piece? I would have been embarrassed to publish something that poor on one of my products. Any intelligent scientist would look at it and dismiss it immediately.
The report was a project using real-time PCR (qPCR) to quantify copy numbers of a microorganism. This company has specialized experience in qPCR. They know what a standard curve should look like. This standard curve was showing about 70% efficiency (which is well understood to be useless for quantification), and they reported none of the data needed to accurately interpret the results, including controls.
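For context on why ~70% is so damning: efficiency is derived from the slope of the standard curve (Cq plotted against log10 of input quantity) as E = 10^(-1/slope) - 1, where E = 1.0 means perfect doubling every cycle. Here's a quick sketch with illustrative numbers (nothing to do with the actual tech note):

```python
import math

def efficiency_from_slope(slope):
    """Amplification efficiency from the slope of a Cq vs. log10(quantity)
    standard curve: E = 10^(-1/slope) - 1 (E = 1.0 means perfect doubling)."""
    return 10 ** (-1.0 / slope) - 1.0

def slope_from_efficiency(eff):
    """Inverse: the standard-curve slope you'd expect at a given efficiency."""
    return -1.0 / math.log10(1.0 + eff)

# A perfect assay doubles every cycle: slope of about -3.32
print(round(slope_from_efficiency(1.00), 2))  # -3.32
# A 70%-efficient assay is far steeper: about -4.34
print(round(slope_from_efficiency(0.70), 2))  # -4.34
# And the error compounds: after 10 cycles, a 70%-efficient reaction has made
# only (1.7/2.0)^10, about 20%, of the product a perfect one would.
print(round((1.70 / 2.00) ** 10, 2))  # 0.2
```

That compounding is why a curve at ~70% can't support copy-number claims without explicit correction.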
So now we get to the gist of my article, which is: do life science suppliers have to follow the same rules as scientists when reporting results, especially when the results are going into Application Notes or Technical Reports that the company is trying to pass off as real papers? I think they do. I think the same standards for good scientific practice apply. And if you are going to publish customer data to promote your product, then it had better be good. Don't print crap simply because it's not your in-house data. If you publish crap, it reflects on you. There really is no excuse for it. This company has an army of scientists they can consult. Believe me, if they had consulted them, those scientists would have told marketing, "don't publish that, because it makes our kit look bad." In this case, it is obvious that marketing has no clue how their products work, what customers look for, or what a good result is, and just figured that if a customer did it, it must be good.
Some of you who use qPCR in your work may have seen or heard about the guidelines that were created by a group of experts on qPCR to help fix the problem of shitty data getting published into journals. The article is called The MIQE Guidelines: Minimum Information for Publication of Quantitative Real-Time PCR Experiments.
These guidelines are for both reviewers of peer reviewed papers and for authors. The goal is to normalize what data is being reported in qPCR experiments so that the same high quality work is published and complete data is provided.
I can't tell you how many papers I've seen in my favorite over-rated journal where qPCR data is published and is missing all kinds of controls and information that make it impossible to actually believe the results. I read some of these articles wondering, how the hell did this get published?
In my company, because we do not publish in peer reviewed journals, we put together many of our own scientific reports in a journal like format for publication on the web and in print. We follow the MIQE guidelines for our qPCR work and we write our materials and methods so that someone reading it knows exactly what we did. We interpret the data carefully and do not make conclusions that are not supported by the data. We want to have credibility.
And when I use customer data, it has to be good quality or we don't use it. It has to be as complete as if it were going into a journal. It not only represents the product, but it represents my company. Apparently not all companies think that way.
I'd like to end with some questions for you. What is your opinion of marketing data? Does the marketing data in an ad or a brochure sway your opinion? What about technical reports/application notes? If you do read them, do you scrutinize the science? Or am I just being too much of a perfectionist?
Or we have some weird psychic connection.
So- your thoughts on the relevance/accuracy/influence of marketing data?
I'm lazy, here are the two emails I wrote:
I really do think it would be beneficial to get a biotech perspective in the science blogging community. The problem is that it's hard to walk the line between "Is this science?" and "Is this biotech marketing propaganda?" Not to knock LifeTech, but their latest Dynal bead video slamming nanobeads is exactly what science marketing shouldn't do. Your audience wants to be given facts that are supported by data and figures (preferably peer reviewed), not just pretty pictures and speculation. That's the approach biotech marketers are going to have to take to not be seen as just slimy wares hawkers.
A lot can be learned from the Pepsi-gate scandal, and being open and honest about your products without hyping them too much is what's going to garner the most support in this community. Your advertisements can be something completely different, but if you're trying to make a social connection with potential clients through the blogging community, you're going to have to adopt some new techniques. We're highly skeptical, and unfortunately with anonymity people are much more willing to tell you their opinion (usually overblown and with profanity...). That can produce exactly the opposite reaction you're looking for.
The things that I'd be most interested in seeing from a biotech company are probably the least likely to be shared. It'd be awesome to see a biotech company have a blog series detailing the development of a product with data and discussion of things that worked and didn't work along with comparisons with other competing products. I realize this kind of thing would probably drive your legal departments nuts ;)
#2: Kristy asked why I might be interested in a product development blog.
I'm a data geek. Scientists are data geeks. Some may be more or less excited to see and understand the theory/experiments that went into developing a product or group of products. I feel like if I know more about the product, I can trust that it does what it promises on the box. I remember looking at an ECL ad when I was in grad school, and the company said, "Our ECL is 10x better than Promega!" And it had this really impressive picture of a western blot. So I asked for a sample and did a side-by-side comparison, and what do you know? They're exactly the same! At that point I didn't even care if the new product was cheaper; their marketing tried to dupe me with crappy data, so I am a loyal SuperSignal fan for life!
Now, I understand biotech needs a hook to sell products and get eyes, but these kinds of things aren't helpful and don't build strong relationships. Having a blog explaining the development of the product and opening the walled garden may give you the edge with scientists, even if your product isn't a whole lot better than a competitor's. The fact that the scientist knows and understands exactly what is in there gives them confidence in your product. For example, I really hate it when I run out of a Qiagen buffer, call them up and say, "I ran out of buffer EX, what do I need to make some more to finish this experiment?" And the rep responds with some cheesy line about how the recipes are proprietary and blah blah blah. Do you really think I'm going to screw your company by making my own buffers? It's not like you just sell columns! I want to finish my damn experiment, so just tell me how much salt is in the damn Tris-based buffer :P It's little things like this that make me think twice when I buy products. I think a blog talking about product development honestly would build trust and make me much more confident in my product choices. But that's just me. Maybe I'm a minority :)
I thought the Dynal thing was distasteful too.
We did a poster last year where we explained the development of a product, but we have to be careful also. Doing that saves our competitors months of R&D. We can't get patents on everything we do, so we have to keep some things trade secrets.
For example, if we spent a year figuring out how to stabilize RNA in a plant leaf or a seed (something no one has figured out how to do), we wouldn't tell the whole world how we did it otherwise a year of research will be for nothing. We could give some hints around it, but we couldn't reveal "when you mix 10% of chemical X with 2.5% of chemical Y, it allows for the RNase inhibitor Z to pass through the waxy cuticle of a plant leaf and through the cell wall to protect RNA and stop gene transcription."
If it were patented, then you could reveal more, but not trade secrets. The Qiagen kits are all trade secrets, and the plasmid midi/maxi anion-exchange kits are patented, which is why they give you all of those buffers.
I do get really annoyed at marketing that says how much better something is than a competitor when it really is just the same. It is better to say "it's as good as" the leading manufacturer in this case. Don't overpromise and underdeliver.
I don't think they would get any scientist that wants to keep their job to write a product development blog. I don't even want people to know what we are working on much less how it works.
Jade, why couldn't the blog come out as a series AFTER the product has come out and is patented? Then you could have the product development history and release the whole story retroactively.
Well, it could, but most patents take 2-3 years to be granted. You can sell the product as "patent pending" until then, and your invention is publicly available during this time as well.
But what if you don't get the patent? Then you'll have revealed all the effort, and your competitors can come in and sell the exact same thing at 1/10th the cost. They could even sell the product for less than you, because their R&D investment is so much lower than yours.
What products or kits would be of greatest interest to you to hear about, if there is anything in particular that you would be interested in?
I think there's a way to present the data in a blog series that shows data without giving away essential secrets.
It may not satisfy the curiosity of the reader.
I won't volunteer to write that type of article while I have a job (and need to keep working). Maybe when I retire. But even then, I am bound by confidentiality agreements that last for years after leaving.
It might be cool to get inventors to write articles about products that are now off patent. Like the anion-exchange resin in the Qiagen plasmid kits, for example. Or the invention of TRIzol. The Taq patent is expired, and everyone knows the story behind PCR. I think the best chance of getting articles about product development would be products where you can no longer be sued for infringement.
That's a pretty cool concept, writing about stuff that's no longer under patent.
I sat in on a MIQE talk and we'll be incorporating it into our own research. At what point do you consider the cutoff for a standard curve efficiency to be useful for quantification?
Yeah we don't really have a lab webpage and I'm not going to sit down and learn Java anytime soon. I'm a huge proponent of MIQE as a lot of qPCR papers leave out a ton of the details. In fact nothing pisses me off more than shoddy M&M sections.
Yes- it would be basically on the classic techniques. Now someone needs to interview Metin Colpan or Piotr Chomczynski or ask them if they want to write 1000 words on their inventions.
It would probably be a better idea for a book. It could be about the greatest life science inventions of all time and include the entire story of how they were conceived, developed, and then sold and marketed.
My list would include:
DEAE Anion-exchange (plasmid resin)
RNALater (still on patent)
qPCR- hydrolysis probes
Antibody Hot Start PCR
80% is the absolute minimum efficiency, but really, that's not going to have high accuracy. One cycle difference can be huge.
We aim for 90% efficiency for our work.
There are formulas that can take into account the efficiency so you can correct for it. I just don't see people using them in publications.
It happens that sometimes you really can't design a better assay- maybe the sequence is just too GC- or AT-rich to get high efficiency, or you are trying to design in conserved sequences and are limited. I think as long as the person explains these limitations and corrects for the low efficiency, it is fine.
Some qPCR mixes are better than others too. We've seen big differences between master mix suppliers.
In general there are many ways to improve the efficiency so it may take some tweaking but it should be done if the replicates are not tight or the assay is not linear.
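For the curious, the best known of those correction formulas is the efficiency-corrected relative expression ratio from Pfaffl (2001). A minimal sketch with made-up Cq values (not from any real assay):

```python
def pfaffl_ratio(e_target, dcq_target, e_ref, dcq_ref):
    """Efficiency-corrected relative expression ratio (Pfaffl, 2001).
    e_* are amplification factors (2.0 = 100% efficient); dcq_* are
    Cq(control) - Cq(sample) for the target and reference assays."""
    return (e_target ** dcq_target) / (e_ref ** dcq_ref)

# With both assays perfectly efficient, this reduces to the familiar 2^-ddCq:
print(pfaffl_ratio(2.0, 3.0, 2.0, 1.0))  # 4.0
# Plug in an 85%-efficient target assay (factor 1.85) and the answer shifts:
print(round(pfaffl_ratio(1.85, 3.0, 2.0, 1.0), 2))  # 3.17
```

Which is the point: a lower-efficiency assay can still be salvageable, but only if the efficiency is actually measured and reported so it can be plugged in.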
If you want to ask Greg Shipley (one of the authors who gives many lectures on the paper) about the database issues, you can email him directly or ask at this email: firstname.lastname@example.org
Greg answers questions on this Yahoo list very frequently. He is super friendly/helpful.
Jade, thanks. Doing environmental work a few of the probes I'm stuck with using have a high degree of degeneracy and I've noticed that as the degeneracy increases the efficiency drops (which isn't all that surprising). The lowest efficiency I had was 88%, and that primer set had one primer of the pair that had an internal stretch where half of the nucleotides were degenerate. Maybe we should have tweaked the assay a bit more to increase our efficiency and it's something to strongly consider for future studies.
88% is not bad- it's close enough to 90%. You could try some simple changes just to see if there is a difference (switching the SYBR mix alone could give you a couple percent boost). The protocol is a little different- 2-step cycling instead of 3- but it works.
Sometimes one of your dilutions is throwing it all off- such as the first or the last. You might want to take it out and see. If the first dilution comes up too soon (in the first 10 cycles) or the last dilution is coming up too far away from the one previous, take them out. It changes the dynamic range of the assay but you'll have accuracy.
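To make the dilution-dropping advice concrete, here's a small sketch (standard library only, with a hypothetical 10-fold dilution series) that fits the standard curve and shows how one stray dilution drags the apparent efficiency:

```python
def fit_standard_curve(log10_qty, cq):
    """Least-squares fit of Cq vs. log10(quantity).
    Returns (slope, r_squared, efficiency), efficiency = 10^(-1/slope) - 1."""
    n = len(cq)
    mx = sum(log10_qty) / n
    my = sum(cq) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(log10_qty, cq))
    sxx = sum((x - mx) ** 2 for x in log10_qty)
    syy = sum((y - my) ** 2 for y in cq)
    slope = sxy / sxx
    return slope, sxy ** 2 / (sxx * syy), 10 ** (-1.0 / slope) - 1.0

# Hypothetical 10-fold series; the most dilute point comes up ~2 cycles late.
logq = [7, 6, 5, 4, 3, 2]
cq = [12.1, 15.4, 18.7, 22.0, 25.3, 30.5]

_, _, eff_all = fit_standard_curve(logq, cq)
print(round(eff_all, 2))   # 0.91: the stray point drags efficiency to ~91%

_, _, eff_trim = fit_standard_curve(logq[:-1], cq[:-1])
print(round(eff_trim, 2))  # 1.01: dropping it restores ~101%, at the cost of range
```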
Depending on the instrument you have, dropping the annealing temp or slightly longer annealing time might do it.
The size of the product has a big effect. Keep it below 250 bp.
The template DNA used to generate the standard curve could alter the results. If you use plasmid, linearize it. We check ours against genomic DNA. Some people like to make DNA oligos (if the template is 60-80 bp) and use them for efficiency checking.
I think it is ok to be in that 80-85% range as long as all the details are reported.
Jade, thanks for the tips.
1. We used a BioRad qPCR machine for this latest report. It's the last time. I believe the stuff coming off our Roche is better.
2. We are using plasmid DNA for our standard curve, so I'll have my support scientist linearize it from now on.
3. The 250 bp max limit is tough for some of the work we're doing, unfortunately. We definitely need new primer sets for some of these genes, but universal priming sites are difficult to find. I may have to move to class-specific sets.
In an age when the average person in North America is subjected to over 5000 ads daily, most of us have learned to filter them out or view them with a high degree of cynicism. At Kinexus Bioinformatics Corporation, we have tested in-house about 3500 antibodies from over 25 different vendors, and more than 80% of these commercial antibodies are either impotent (i.e. weak), non-specific ("dirty"), or both. One would not get this impression from viewing the images of immunoblots with these antibodies in most print and online sales catalogues. Since there really are no page restrictions in cyberspace, more data on the performance of products in diverse situations on company websites would be most appreciated by potential customers. Unfortunately, some infamous vendors don't even perform basic testing of their antibodies and leave it up to their customers to determine whether the product performs. Regretfully, such practices result in a lot of wasted time, money and effort.
This is a great post Jade. In my field, I put little to no trust in marketing. I always try to talk to the R&D folks directly. Not that they will ever say anything bad about the product, but if you know what questions to ask you can figure out under what conditions they tested their stuff, and possibly get a hint to the real limitations. No matter what, when we get new stuff in, we run our own extensive testing.
This all happens unless it is a trusted vendor, w trusted reps, who know what they are doing. The personal relationships are VERY important, and I agree, you do not want to mess those up. The more trustworthy a rep is, the more realistic they are about the data, the stronger the bond, and the more certain they can be that they will ALWAYS be called upon for more stuff.
If I get any misinformation from a rep, if they say they know something when they don't, or they say something is for certain when it's not, we're done. I won't deal w them again, and probably not their company either. It's just not worth my time. So I very much agree w you on this.
Great post Jade.
We've posted an exclusive interview with Stephen Bustin, lead author of the 2008 MIQE paper published in Clinical Chemistry. Anyone interested in a "behind the scenes" look at MIQE should read An Interview with MIQE Guru and qPCR Expert Stephen Bustin
Here is the link to the plasmid paper:
There are some critiques- people reviewed it here, but I think that it's worth comparing the reproducibility and efficiency of linear vs. supercoiled and then see if it matters for your assay. If not, it may not be worth the extra steps.
You can go a little bigger with the amplicon, but efficiency goes down with increasing size because it becomes more difficult to get doubling in each cycle. And if you use Roche, you are probably cycling extremely fast- like 5 second extensions? If efficiency is good, that's all that matters.
Most companies will simply redirect you to a landing page on their site with links to marketing brochures and data, so even then, it is still the questionably perfect images or competitive comparisons. In the end, you have to try it yourself anyway. That's where free samples are nice, so that people don't waste time and money. You can get a refund if the antibody doesn't work, can't you?
You make a really good point- that the sales people in a company can make all the difference. Maybe companies should invest less in full page ads and more in good people and teaching them how their products work so they can have a stronger relationship with their customers.
Yea, that would be fantastic.
There is nothing wrong w marketing in itself, but in recent decades, it seems that marketing in all fields has taken the place of actual product value. The goal became to make pretty pictures, w pretty faces, shiny glittery eye candy, and get people to buy shit that doesn't really perform... That's pretty much how we ended up w today's tv commercials... half nekkid people supposedly representing shit that has nothing to do w the ad itself. It's not quite as bad in the Sci world, but it does seem to be on that path.
I would love to see just what you said happen: more education for the sales reps, more honesty about the products, and of course more money spent on R&D rather than on color ads on fancy paper that go right in the trash...
This got me to thinking... I love ads that show "Best Product Evar!" versus "Competitor A" and "Competitor B". Now, I have no idea if I'm using the competitor mentioned in A or B, and the ads won't state who they are either. For all I know, I'm looking at enzymes bought from a company in some third-world country that has the worst QA/QC in the world. I don't care about those ads; just show me what your product can do, send me a free sample, and let me do the comparisons on my own. Obviously, if I'm looking at your product, I'm thinking of switching anyway; no need to show me an ad of a competitor that I can't tell anything useful from.
Tom- I know- it's in the handbook of product marketing, apparently. You must show your product kicks the ass of the leading suppliers. Maybe it did one time. Or maybe it does, but only when you use DH5alpha E. coli in late stationary phase and your NaOH is less than a week old.
I think the main point of those figures is to give you confidence to sample it. If the company can't at least show it works as good as the most popular supplier, then you wouldn't bother to try it at all, no? Sometimes the other selling features, such as convenience or speed or price are not enough to convince people to try something.
Usually, it is easy to guess who the competitors are, since it is almost always the first letter of the company name (Q, P, B, I, A, E, etc.).
OK, how did I not see this post until now?! I used to be a product manager too, and I blogged about the experience here.
I escaped three years ago, and I still shudder at some of the things on the list I made!
Nice article there- thanks for the link. You might find the marketing dictionary interesting- can probably add some terms to it.
And you would laugh at some of the earlier articles I wrote talking about the insanity of marketing...It does take a certain mindset to be able to do that job for a long time.
My name is Roberta Pappaianni, and I'm an Italian student.
I read your article and found it very interesting.
I'm writing my thesis, and the topic is marketing in the pharma and biotech sector.
Do you think you could help me with some articles or something like that?
Thank you so much!
Hello Jade, thank you for this article; I just stumbled upon your blog this morning. I just started my job search, and your articles will definitely help me out. I also just came back from a seminar about opportunities for science PhDs in biotech marketing. This was something I definitely wasn't originally interested in, but I am now considering it. Although the seminar was aimed more at overall experiences working in marketing than at specific work, such as the product advertising highlighted here, your viewpoints raise interesting points for me to consider. Thanks again!