Thursday, November 19, 2009
I have made a mistake, and perhaps led you all astray.
It is true that in the Axopatch, the signal ground is not connected to the power ground.
But, it turns out that it's very difficult to isolate power ground from signal ground without essentially breaking every BNC connection between the amplifier and the digitizer. And when I did that, I got a good amount of hum pickup from those now unshielded wires. At times you could find a position where the pickup was minimal, but it wasn't easy. And given that you've got at least three connections at a minimum (signal output, analog input, and gain), it all adds up to one big pain in the butt.
This advice originated when the digitizers were the old Digidata 1200. That digitizer is specifically designed to isolate the signal ground from the power ground (most important to isolate it from the computer, whose fast switching power supply is a huge noisemaker). That all changed in the 1300 line of Digidatas.
However, it does seem that the simple act of using the grounded-to-non-grounded plug adapter on the amplifier dramatically reduces the RMS noise level. Why exactly it does this, when there's still a connection between signal ground and power ground, isn't clear to me.
Nevertheless, I am available to come and lay my hands on your setups, to reduce the noise. Cause it seems I'm back to that level of rationality for this whole process.
Gotta bring this up with the big guy soon.
Wednesday, November 18, 2009
Tuesday, November 17, 2009
It was the best of rigs, it was the worst of rigs,
It was the age of 0.07 pA RMS, it was the age of 0.5 pA RMS, ±5 pA p-p at 60 Hz,
It was the epoch of separating signal ground from power ground, it was the epoch of connecting all power grounds to the signal ground,
It was one really annoying run on sentence written by some old white dude, it was a kick ass blog post written by a middle aged white d00d.
- bent piece of metal that can be put right in front of headstage/chamber to further shield (not shown)
- Faraday cage
- air table surface
Thursday, October 29, 2009
Wednesday, October 28, 2009
Monday, August 31, 2009
Friday, August 7, 2009
Thursday, August 6, 2009
- Successfully (more or less) integrated a new family member into the finely tuned machine that was Family Blair (HA!). There are posts aplenty conceived to cover this, from the innate differences between kids, to balancing the demands of two kids with work/science. Sadly, balancing the demands of two kids with work/science means writing these posts doesn't happen. Still though, the productivity hit after the second kid was MUCH more minor than after the first.
- Finally published a good chunk of my work covering the modulation of a particular TRP channel. This project has been tortuous at times, yet through many difficulties, I stuck with it. The result is somewhat limited in its scope, but the treatment is thorough. Hopefully it will prove useful to those interested.
- Started collaborating with another group on a cool new project. Things seem to be progressing nicely, and it has been a lot of fun.
- Actually made some progress in finishing up my final paper from...gulp...my thesis work. Or as I like to refer to it, My Own Personal Albatross.
- Have the prospect of starting another collaboration, that would help me complete some older preliminary work.
- I have totally regained my calcium imaging mojo, after neglecting those long lost 1997-1999 era skills.
- Lastly, I finally came to the realization that it's no use to fight against the core of my own personal scientific style. Sure, some parts can (and should) be bent in response to outside considerations. But to fight against the core will lead only to madness.
- I haven't made sufficient progress on the other side of the story to the TRPC channel regulation. The conceptual framework is basically there, and I've got tons of planned expts, but little time to carry them, or the requisite troubleshooting, out.
- I didn't complete the manuscript of My Own Personal Albatross. Really though, I blame the daughter. I thought I had 6 more weeks!
- I haven't done enough to move from heterologous to endogenous systems for studying TRP channels (TRPCs in particular). There are a number of reasons this is tricky, but honestly, that potential trickiness has prevented me from really trying. That's got to change.
- I definitely haven't been able to get any sort of blogging routine down.
Monday, June 1, 2009
So, I present to you, the best cover. Ok, not the best, Jimi Hendrix's version of Dylan's "All Along the Watchtower" was taken, as was Johnny Cash's "Hurt." But this certainly is pretty cool. Let's set the scene for when you'd look to fire this up:
You're a parent, you've got a crabby/colicky/ornery child who just won't go to sleep despite drooping eyelids above black circles. Aha, a lullaby, that'll work. Now what to pick?
Need you ask? The answer is obvious: The lullaby renditions of Tool. Because there's nothing else I want my infant going to sleep to besides such sweet compositions as "Opiate" and "Schism." Here's hoping Volume II has Stinkfist. (Actually, I dig Tool, but c'mon.)
As for the worst cover, I have to go with Sixpence None the Richer's version of The La's "There She Goes". Why? Because other than the switch to a female lead vocal, this cover is essentially exactly the same as the original. That's a big fail in my book. Great covers reimagine the original in ways that the composer never saw, adding and extending it. Every time I hear Sixpence's version I just think, "sheesh, I liked the version in 'So I Married an Axe Murderer' a helluva lot more."
Here it is if you must:
Tuesday, May 26, 2009
Sunday, May 10, 2009
Here's hoping you get a little extra time for yourselves to relax and enjoy the day. Here in the Blair house we're celebrating with a spinach and pepper frittata, bacon, home fries, and mimosas made with fresh squeezed orange juice. Break out the champagne flutes!!!
Then later we'll feast on the chocolate mousse cake. YUM!
Saturday, May 2, 2009
Here's to the next decade together!
Wednesday, April 29, 2009
In the first, the Press's Executive Editor, Mike Rossner, discusses the practice of bundling large numbers of journals by the mega scientific publishers, and the effects on university libraries. Unsurprisingly, the current economic climate is affecting not just newspapers (do you hear that Boston Globe? That...is the sound of inevitability), but will have big impacts on science publishing. And that doesn't even take into account moves towards Open Access. Check it out here.
Here's one very interesting tidbit from the editorial:
"The Rockefeller University library subscribes to bundles of online journals from several megapublishers. For one of the bundles, the top 10% of journals garner over 85% of the hits to the bundle from users at the University. Over 40% of the journals in the bundle had no hits at all from the University in 2008!"

In the second editorial, from the May issue of JGP, Editor Edward Pugh takes on one of my personal hobby horses: Supplementary Data. Now in principle there's nothing wrong with Supplementary Data; it's just that currently there seem to be few standards about how they should be dealt with, both in review and archiving. Pugh clearly sets out at least JGP's position on them:
"Several pressures now call for a review of policy on Supplemental Material. One pressure comes from the growing use of such material by other journals as an omnibus substitute for publishing scientific material. Increasingly, methods, theory, and even primary results are offloaded to supplements. As a community, we need to question such practices, asking whether they are dictated by the goals of science or by financial expediency, and inquire as to the short- and long-term consequences of such practices for science."

So go check that out too. Oh, and while you're there, check out a modest little paper by Blair, Kaczmarek and Clapham. All 14 figures of it, that is!
Friday, April 24, 2009
In this case, there's a poll at the LA Times blogs (here), stemming from a recent pro-research rally held at UC in support of animal research. The results have tilted towards the anti-research poll option, so for any of those folks out there who support the responsible use of animals in research, and realize that there can be little biomedical science without it, go and vote. And tell your labmates, friends, parents and grandparents to do the same.
And if you're interested in the things that impede meaningful debate about animal research, go check out Dr. Free-Ride's recent series on the topic!
Tuesday, April 14, 2009
This post is the final end of a big sigh that wraps up last week's Week of Not Very Much Fun.
-First, I spent the weekend before last worrying about whether I had left my bag (complete with laptop) inside our daycare center or outside in the parking garage. The first being probably ok, the second not so much. It's amazing what 5 months without >4 hours of continuous sleep will do to a brain's powers of recollection. Luckily, the bag was saved by one of the day care folks. So I didn't lose my 4 year old laptop with its broken hinge, on which I'm writing this very post to you good readers.
-Second, I was all ready to spend last week working on a data presentation for the lab on Thursday (which is fairly involved, given the size of the lab and long stretches of time between turns), as well as a 25 minute talk for the Neurobiology Dept. on Friday. But, fate intervened, as we got our paper finally accepted (good!!), but the editors asked us to turn the final changes around in 2 days (ok, doable, but starting to cut things close).
-The final straw was the girl coming down with an ear infection. She needed a couple days off daycare, which my wife and I split. She improved so quickly after the ped visit, and thank FSM for antibiotics. That basically killed the possibility of the lab data club, but I was able to get the final manuscript changes done and the talk prepared. Of course, with even less sleep than usual. I think this led to one assessment of my talk, which was "Clear, but needed more enthusiasm." Fair enough, but we're almost at the breaking point here people.
What I'm referring to are some myths we live by in the lab. One cherished by some people is that competition inside the lab improves productivity. That's complete bullcrap. Sure, there are lots of PIs whose management styles use it, but it's simply wrong. Apparently Candid Engineer's PI feels this way. Which sucks, even if it is all too common.
Unfortunately, what these jerky PIs ignore is the actual data that suggests that competition within groups hurts overall productivity. Teresa Amabile is a Harvard business school prof who has tracked the daily work of people in high tech, chemical, and consumer products industries, and the results run completely counter to many of the preconceived notions we have about creativity. An article in Fast Company discusses the 6 Myths of Creativity. All of them are good, but here's the money quote for this discussion:
I wonder how well this observation scales beyond individual lab groups to science as Science. How much competition is good, and when does it start to be detrimental? Certainly the last sentence here can be applied to Science.
5. Competition Beats Collaboration
There's a widespread belief, particularly in the finance and high-tech industries, that internal competition fosters innovation. In our surveys, we found that creativity takes a hit when people in a work group compete instead of collaborate. The most creative teams are those that have the confidence to share and debate ideas. But when people compete for recognition, they stop sharing information. And that's destructive because nobody in an organization has all of the information required to put all the pieces of the puzzle together.
Next, Bob Sutton is a Stanford B-school professor, who wrote a book called "The No Asshole Rule" (how great is that? Plus he has a kickass blog, which has been on the Googly Reader for some time). His recent post highlights another group's paper:
...using quantitative analysis to uncover patterns across large numbers of studies -- in this case, 72 studies of nearly 5000 groups. The overall findings aren't a surprise, that groups that engage in more information sharing enjoy better performance, cohesion, knowledge integration, and satisfaction with decisions made
And if not, then do us all a favor and wear a goddamn button that says, "I'm the Michael Vick of pitting my trainees against one another." Then we'll all be fairly forewarned.
Wednesday, April 1, 2009
So, what is a liquid junction potential? Sure, maybe you could look in some of the Electrophysiology Bibles. Or maybe you could even hit up an electrochemistry textbook. But it's 2009, and you've got two things on your side: Google, and me. So forget that, and allow me to regale you with the story of the liquid junction potential:
And in this Gedanken experiment there was a pipette filled with your typical pseudo-intracellular solution: You know the drill, high potassium (light blue), low calcium, and an anion species that's usually not chloride. This anion could be something like methanesulfonate, gluconate, or my own personal favorite, aspartate. The main thing to note is that all of these are bigger than chloride, and bigger than potassium. Thus, they have a lower mobility, meaning they don't diffuse as quickly as the accompanying cation.
Figure 2*: The Gedanken-imposed barrier is removed, and ions are diffusing down their electrochemical gradients. The bigger, slower aspartate can't keep up relative to the smaller, faster potassium, sodium and chloride. It gets left behind in the pipette, generating an excess of negative charge.
Note that a liquid junction potential would also occur if the bath solution has cations and anions with significantly different mobilities. It just turns out that sodium and chloride have pretty similar mobilities, so that their contribution to the liquid junction potential is much smaller. But if you have N-methyl-d-glucamine (NMDG) as the main cation in your pipette solution, you'll have an excess of positive charge in the pipette solution, and a corresponding slightly positive junction potential.
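For the curious, the mobility argument above can be turned into an actual number with the Henderson equation. Here's a minimal Python sketch; the relative mobilities are approximate values I'd label as illustrative assumptions rather than reference data, and the sign convention should be double-checked against a dedicated tool like JPCalc before you trust it:

```python
from math import log

# Approximate ionic mobilities relative to K+ (illustrative assumptions,
# not authoritative reference values)
U = {"K": 1.0, "Na": 0.682, "Cl": 1.0388, "Asp": 0.30}
Z = {"K": 1, "Na": 1, "Cl": -1, "Asp": -1}

def henderson_ljp(pipette, bath, temp_c=25.0):
    """Liquid junction potential (bath relative to pipette, in mV) from the
    Henderson equation. Concentrations in mM. Sign convention assumed here:
    positive means the bath is positive relative to the pipette."""
    RT_F = 8.314 * (273.15 + temp_c) / 96485.0 * 1000.0  # RT/F in mV
    ions = set(pipette) | set(bath)
    num = den = s_pip = s_bath = 0.0
    for ion in ions:
        u, z = U[ion], Z[ion]
        cp, cb = pipette.get(ion, 0.0), bath.get(ion, 0.0)
        num += z * u * (cb - cp)
        den += z * z * u * (cb - cp)
        s_pip += z * z * u * cp
        s_bath += z * z * u * cb
    return RT_F * (num / den) * log(s_pip / s_bath)

# K-aspartate pipette against a NaCl bath: the slow aspartate lags behind,
# leaving the pipette negative relative to the bath
pip = {"K": 145.0, "Asp": 145.0}
bath = {"Na": 150.0, "Cl": 150.0}
print(f"LJP ≈ {henderson_ljp(pip, bath):+.1f} mV")
```

With these numbers the junction potential comes out in the +15 to +20 mV range (bath positive relative to pipette), consistent with the excess negative charge left behind in the pipette described above.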
Next up, how to measure the liquid junction potential.
*-Note that these figures were created using Inkscape, a very cool and usable open-source vector graphics drawing program (a la Illustrator). Check it out, download it, play around with it!
Tuesday, March 31, 2009
Let's just hope I don't die some rock and roll death. You know, the "Nat didn't show up in the lab for the big experiment, we found him having choked on vomit, but we don't know whose vomit it was, cause you can't dust for vomit." It's either that or a bizarre gardening accident, and it is spring time.
(Seriously though, I love science, right beneath the family, but sometimes it's a heartbreaker. Goddamned unrequited love.)
Friday, March 27, 2009
People are out there reading, so there must be people out there who care. Keep 'em coming folks, keep 'em coming!
By the way, what's with variability between the citation search engines? I know this has been covered before somewhere, but really.
Searching on my first paper from grad school, we get:
90 cites in ISI from the J Neurosci site.
93 from ISI itself
98 from Scopus (I got a free preview for reviewing a paper. It's pretty cool, but I haven't used it enough to say much substantive. I do like the display of citations by year, Journal, author).
102 from Google Scholar
Now, some of these I'd cut slack for (e.g., J Neurosci probably only pulls cite numbers from ISI infrequently, and who knows how reliable Google Scholar is), but what's the difference between ISI and Scopus? Anyone look deeper into this?
If citations, h-indices and impact factors have traction as important metrics, shouldn't they be, oh I dunno, accurate?
Thursday, March 26, 2009
Then reality struck this student, rapidly disabusing them of this conceit:
"He was shocked to discover that it would take him such a long time to learn the technique (he's starting from level 0) and said that it seemed so easy when reading it from some published paper!"
Every newb thinks that a technique they haven't mastered is easy, until they actually try it. And in fact, the bare bones mechanics of patching are pretty straightforward. I've taught a lot of novices how to patch, and by and large they can get to the point of gigaohm seals in a week or two (ok, we're talking transfected HEK cells here). Hell, I'm thinking any primate above lemurs could learn to get seals. (Not a bad idea actually; screw those automated patch systems, gimme an army of squirrel monkeys and an old warehouse, and I'll screen your chemical library right quick! It'd be like the nut shelling squirrels in Willy Wonka. And they'd literally be DrugMonkeys! LOLZ.)
The transition between the crappy recordings of the apprentice and the regular good recordings of the master takes a long, long time, on the order of a year I'd say.
These are the dark times, where the progress is non-existent, perhaps to a greater extent than an analogous part of the curve for other technical subspecialties. Most electrophysiologists I've talked with had this time in their training, typically falling into the 2nd year of graduate school.
And yet, there's very little useful advice the masters can give their apprentices during this time, other than "keep at it". Sure, there are suggestions to try this, or don't do that. In the end though, everyone just has to put in their time, slowly perfecting each requisite skill, and evolving their own personal technique. It sucks, for sure, but it does end.
It's just not gonna end before your rotation or your last few months before you finish your thesis.
Wednesday, March 25, 2009
First, what the heck are we even doing? Well, we're gonna pull a glass needle, fill it with salt solution, stick it on a plastic holder with a wire inside, maneuver it to a cell, apply a little suction, and let the magic of "seal formation" occur. Next, assuming we're doing whole cell voltage clamp, we break the seal membrane with more suction, gaining control of the voltage across the cell's remaining membrane, while recording the current (also filling the cell with our pipette solution). Sheesh, when you distill it down to two sentences, it pretty much trivializes what I spent years learning and do everyday.
The opening of the pipette tip will be ~1 µm, while the cell is on the order of ~10 µm. Obviously, if the pipette tip is too big, then we'll just suck up the entire cell. Not good. But, as the pipette tip gets smaller, the resistance between the pipette interior and the cell interior gets larger. Also not good. In fact, that causes a whole host of problems that are left as an exercise for the reader to derive (ok, just kidding. There's a series of posts reserved for this, with current working title: "Dr. RseriesLove, or, How I learned to stop worrying and love the fact that my currents are all wrong").
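To put some rough numbers on why a high-resistance tip is "also not good," here's a back-of-the-envelope Python sketch. The 5 MΩ, 10 pF, and 1 nA figures are just illustrative round numbers, not from any particular recording:

```python
from math import pi

# Illustrative round numbers (assumptions, not measured values)
Rs = 5e6       # series (pipette) resistance, ohms
Cm = 10e-12    # cell membrane capacitance, farads
I = 1e-9       # whole-cell current, amps

# Steady-state voltage error: the membrane sits at the command potential
# minus I * Rs, so big currents through a big Rs mean a big error
v_error_mV = I * Rs * 1e3
print(f"voltage error: {v_error_mV:.1f} mV")  # 5.0 mV at 1 nA

# Rs and Cm also form a low-pass filter that slows the voltage clamp
tau_us = Rs * Cm * 1e6
f_c_kHz = 1 / (2 * pi * Rs * Cm) / 1e3
print(f"clamp time constant: {tau_us:.0f} µs, corner ≈ {f_c_kHz:.1f} kHz")
```

Double the tip resistance and you double both the voltage error and the clamp time constant, which is why everyone obsesses over pulling wide, low-resistance tips.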
I start by cutting the capillary glass, by scoring it with a diamond pencil and breaking it off to the correct length (so it'll fit in my particular set up, given the headstage position, etc.).
Then I smooth the ends of the capillary glass with a bunsen burner flame, because a jagged end (even as it comes from the factory) will scrape off the silver chloride on the wire that transmits the current from the ions in solution to the electrons in the amplifier circuitry (as well as tearing up the O-ring in the headstage holder).
Then we move onto the puller itself. There are many different kinds of pullers, but having been in a number of electrophysiology labs, I would say most people use pullers made by Sutter Instruments. The basic puller operation is to melt the glass capillary while pulling on either end, drawing the ends apart. Now to get a nice wide tip patch pipette, we use computer controlled application of the heat, allowing you to stop the heating a certain time after the capillary begins to pull apart. Over repeated heating/cooling cycles, you can make the perfect pipette.
Here's the puller, a P-97, and if you unscrewed the 5 screws on the front panel, you could peer in and see the brushless super quiet 92 mm fan we installed (way in the back of course, a real pain in the arse to reach). The smoked plexiglass cover opens to reveal:
The business end of the puller. The circled thumbscrews clamp down on the ends of the capillary and maintain tension. The capillary feeds through the box filament, which gets hot when the puller is activated (sorry for the flash glare here). When the glass separates, we're left with a pair of pipettes, which we fire polish by bringing them close to a red hot wire (observing under the microscope). Finally, we're ready to patch!
The pipette is filled with intracellular solution, stuck in the polycarbonate holder (which has the silver wire in it), and stuck into the headstage of the amplifier. The suction tube allows you to provide positive pressure while you're approaching the cell, or negative pressure to form the seal and to break through. The cells are sitting in an extracellular-like solution in the chamber, and the pipette approaches under micromanipulator control (here, a piezoelectric based Sutter MP-285), all the while watching through the microscope. In fact, the pipette in this picture is making a GOhm seal on a little HEK cell. Of course, when I applied suction to break through, this cell was terribly leaky (again, Electrophysiologicus, patron of patchers, I'm like so over that hubris- could we maybe move on now?).
If you look closely, the tip of the pipette is wrapped with a thin strip of Parafilm. This helps reduce the capacitance of the pipette, but isn't nearly as time consuming or messy as using Sylgard. Low pipette capacitance is also a requirement for setting the series resistance compensation. All of which are good topics for future posts!
Hope this was at least mildly useful to some people out there, and marginally enjoyable to others. If anything's not clear, just fire up the comments and lemme know.
Monday, March 23, 2009
In other news, I turned 35 yesterday. It was a great day (thanks to the wife and family!), but frankly it's a crappy age. First time I actually feel old on my birthday. *sigh*
Thursday, March 19, 2009
I've been having a recent bit of difficulty with our pipette pullers in the lab. Well, pulling program parameter searching is no news to any of the l33t electrophysiologists who frequent this blog. We've all been there, and we'll all surely revisit that terrible state of being.
But not too long ago one of the pullers started making a horrible noise when switched on, as the cooling fan must have lost a ball bearing. The company service folks were in the area for the recent Biophysical meeting (how was it Dr. Samways?), and made a swing through the lab. They offered to refurbish the whole thing and replace the fan for $2000. Not a pressing issue, but apparently as the heat builds up inside the puller, it would slowly dim the display, making it hard to edit programs.
Economic times being what they are, we demurred. Which is a good thing, because when we pulled the front panel off that sucker, the fan was just a 92 mm fan like you'd have in your computer. So we got one of those, pulled out the old dead one, and slipped in the new. Voila! And while we were at it, we got rid of the decade plus layer of dust in that thing.
Still though, I miss the old P-84 I used in my thesis lab.
Tuesday, March 10, 2009
(I'm digging the bass in this one!)
Figure 1: Sodium currents during action potential waveforms in nociceptive sensory neurons.
Tuesday, February 17, 2009
Thursday, February 12, 2009
Physioprof added a great comment as well, about his excitement at an old breakthrough achieved in grad school. You can feel his enthusiasm, even for an event that must be years old by now. And as he says, we're all chasing that feeling. I remember one of my own as well, which I will share. For my first project in grad school I was recording sodium and calcium currents during action potentials in nociceptors. As there's no good blocker for TTX-resistant sodium currents, I settled on using ionic substitution, replacing external sodium with NMDG. That worked well for the subtraction, but I did notice that the resulting sodium current kept increasing even as the voltage approached ENa, which obviously shouldn't happen. I kept that stuck in the craw of my brain, which chewed over it as I proceeded to look at calcium currents in other cells. (Which is how I normally let it happen; given time the craw digests whatever problem is given to it).
I can still remember sitting at my desk, looking over some other experiments, when it hit me. It was obvious that intracellular Na+ and K+ were making outward currents through unblocked TTX-R channels, and these became sizeable at depolarized potentials near the peak of the AP. In retrospect it isn't so surprising. But it wasn't so obvious to me that the outward currents would really be large enough to make a big difference, relative to the large inward currents when external sodium was high. Turns out they were. Very soon after that I figured out a way to correct my previous results, which became a figure on its own in the final paper. All in all a great feeling.
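The effect is easy to see with the GHK current equation. Here's a hedged Python sketch (the concentrations are illustrative round numbers, and a real TTX-R current would of course also carry gating): with external Na+ present, the current through open channels at depolarized potentials stays inward, but with impermeant NMDG outside, the only permeant cations are inside, so the current must be outward.

```python
from math import exp

F = 96485.0     # Faraday constant, C/mol
R = 8.314       # gas constant, J/(mol*K)
T = 295.0       # ~22 deg C, in K

def ghk_current(v_mV, conc_in_mM, conc_out_mM, z=1, perm=1.0):
    """GHK current equation in arbitrary permeability units.
    Positive = outward. Illustrative only."""
    v = v_mV * 1e-3
    if abs(v) < 1e-9:
        v = 1e-9  # sidestep the 0/0 at exactly 0 mV
    xi = z * F * v / (R * T)
    return perm * z * F * xi * (conc_in_mM - conc_out_mM * exp(-xi)) / (1 - exp(-xi))

# With normal external Na+, the current at +40 mV is still inward (negative)...
i_normal = ghk_current(40.0, conc_in_mM=10.0, conc_out_mM=140.0)
# ...but with external Na+ replaced by impermeant NMDG, internal Na+ can only
# flow outward through the unblocked channels
i_nmdg = ghk_current(40.0, conc_in_mM=10.0, conc_out_mM=0.0)
print(i_normal < 0, i_nmdg > 0)
```

So near the peak of the action potential, where the driving force on the remaining internal cations is strongly outward, those unblocked TTX-R channels can pass a surprisingly sizeable outward current, which is exactly what was contaminating the subtraction.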
I think we all just hope that our science doles out sufficient number of these moments to keep us from totally giving up in the face of so many difficulties.
Friday, February 6, 2009
And both produce perverse incentives.
Let me add as a final note, if you scan people's CVs for Cell, Nature, Science papers, then you're judging by impact factor.
Thursday, February 5, 2009
Sure, most of these thoughts are neither particularly well considered nor highly developed. Indeed, they're probably not very insightful either. But the best way of improving upon them is to just start writing about them, and hopefully get feedback from other people (especially fellow working scientists). One benefit of this approach is that I'm not so wedded to my thoughts that there's a huge cognitive barrier to overcome if they need altering.
With this entire morass in my head, regardless of the current state of each, I've begun by thinking explicitly about the "why?" question*. Why do we do what we're doing? What's the ultimate purpose of it? In my mind, before we can judge whether a particular way of doing things is good or bad, we need to figure out the answer to the "why" question.
So let's ask the question about Science, broadly speaking. What is the purpose of our endeavor? The answer to that question forms the ultimate basis by which its attendant ethics, practices, and structures must be judged.
My answer is that Science's aim is to produce true statements about the world.
I consider this to be the first axiom. I don't even claim this phraseology as my own, as I'm sure I must have read something to this effect over at Dr. Free-Ride's (which is the place I go when thinking about All Things Philosophical). But I can't think of a better way to put it. And I would guess that essentially all scientists would agree with it. If not, I'm all ears about what is the ultimate goal of Science.
Now onto whittling down that morass!**
* - I understand that this isn't terribly insightful. Either River Tam or Arlenna brought it up some time ago (i.e. - the hazy time that existed prior to the daughter's arrival) in a discussion about authorship order issues. Furthermore, just about every project planning book out there contains something similar. Hell, it's in David Allen's Getting Things Done book, so that means about 1.18 billion people on the Earth know it.
But knowing it and doing it are two very different things. If you don't believe me, consider just about any committee meeting you've had the pleasure of attending. How many actually started with "Why?" And how much talk was really about "How"? Besides, having done exactly this in project planning of various sorts, I'm often surprised by how useful it is in producing different ways of approaching and solving problems.
** - Something about this sentence really makes me happy.
Thursday, January 29, 2009
We've already had some bona fide cold weather this winter, with multiple sub 20 degree F days, and some nights in the single digits. And I really haven't felt terribly cold. This stands in contrast to much of the time spent here in Massachusetts, where I was always freezing. A breath of Canadian arctic air would send me running for the long underwear.
And yet, I wasn't always like this. I grew up in Connecticut, which, though not particularly cold, is far from tropical. Then I went to college in Chicago- Ah, I remember the day, trudging to a morning class, where the air temp was -25 deg F (-50 with wind chill). Overall, I remember the cold as being present, but no big deal.
Then, I started grad school at Stanford. Palm trees! Orange and lemon trees! Chelsea Clinton! I wore shorts EVERY day that first winter. It was the El Nino year, and it rained every fricking day, but I still wore shorts. I distinctly remember the odd looks I received from those Palo Altans. They'd be wearing their hats and gloves, and I'd be traipsing across campus, bare legs and all.
But then a funny thing happened the next winter in California. I froze my ass off. I had to wear pants all winter, and even turned up the electric baseboard heat in my on-campus apartment. WHAT? Apparently, a year of living West Coasterly obliterated any and all cold tolerance I had built up. A loss that took about 10 years to reverse.
Which brings me to what prompted me to even post this bit of boring biographical information: Is this a real phenomenon, and if so, what's the neurobiological basis for it? Do people in different geographical locales actually perceive temperature differently? Is the difference peripheral (i.e. sensory neurons) or later in the processing? For example, what would you see if you compared the sensory neuron activity of lifelong residents of Alaska and Florida? And how does that change?...trp channels?? trp channels?? trp channels??...And is there a sex difference in this aspect (how many of us can recount the differences between our sense of temperature versus our partners'?). And how does aging impact this?
The final question is: when will Spring arrive? Cause though I don't mind the cold so much, I'm already sick of the snow.