NMR Training

I have completed my NMR training today. This means that I can schedule and run my own experiments and no longer have to ask the manager of the NMR facility to do it for me. That is essential for monitoring my degradation experiments and identifying metabolites using 13C-NMR.

The training was good, and after several sessions under supervision (and a final test ;)) I am now comfortable using the instrument. It is a Varian Mercury 300 (with a 300 MHz magnet; a pretty standard configuration for routine measurements). The software is VNMR running under Solaris, so my UNIX knowledge comes in handy here. I have no problem using the software and manipulating my data. The challenge was controlling the measurement settings properly, namely tuning and shimming, in order to obtain nice, symmetric peaks.

This is going to be fun!

13C degradation experiments by microorganisms in snow samples

Well, there is some more work to be done in the realm of microbiology, and I am doing my best to apply my meagre microbiology knowledge and skills to some experiments that investigate the degradation of 13C-labelled malonic acid by microorganisms (bacteria, fungi) in snow samples that I collected in the past.

I have prepared sterile solutions and doped the samples with them, and now they are sitting in the fridge for a while. Hopefully, the bacteria are already happily feasting on the provided nutrient. Let’s see what the GC-MS analysis will reveal in a couple of days.

Teaching CHEM-377B next winter term

After some discussion within the department, I will be teaching an Instrumental Analysis course next winter term. I am sharing responsibility for the lecture with a second person, and my teaching duties will be done by the end of January, so that I can prepare things for my sampling trip in spring. We are now co-ordinating contents with the lab course held in parallel with the lecture, sorting out administrative stuff, and checking with other lecturers for overlap with their respective courses. It is a fun thing to do and I am really looking forward to some more teaching again.

It’s been a while …

… since my last entry, but that is also because I was on vacation, spending some excellent time in Newfoundland, Labrador and the Quebec Cote Nord. Now that I am back at work, I am having a close look at my recent data. I am also dealing with a couple of administrative issues, such as improving backups of GC-MS data by adding an Ethernet card to the data PC.
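Since the data PC now has a network connection, a small script along these lines could handle the copying. This is only a rough sketch: all paths, the share name and the assumption of a mounted backup share are made-up placeholders, not our actual setup.

```python
# Minimal backup sketch for the GC-MS data PC, assuming the new Ethernet card
# gives access to a mounted network share. All paths are hypothetical.
import shutil
from pathlib import Path

DATA_DIR = Path(r"C:\gcms\data")                 # local acquisition folder (assumed)
BACKUP_DIR = Path(r"\\fileserver\backup\gcms")   # network share (assumed)

def backup_new_runs():
    """Copy run files that are not yet present (or differ in size) on the backup share."""
    BACKUP_DIR.mkdir(parents=True, exist_ok=True)
    for src in DATA_DIR.rglob("*"):
        if not src.is_file():
            continue
        dest = BACKUP_DIR / src.relative_to(DATA_DIR)
        if dest.exists() and dest.stat().st_size == src.stat().st_size:
            continue  # already backed up
        dest.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, dest)  # copy2 preserves timestamps
        print(f"backed up {src}")

if __name__ == "__main__":
    backup_new_runs()
```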

And the data look good, indeed. Data analysis is dead slow, though. There is a large number of compounds to compare, and in my initial analysis I missed some compounds in one run but found them in the duplicate run; after re-analysing the data it turned out that the compounds were present in both runs (as they should be), but their signals are quite weak, which is delaying my analysis.
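To keep track of which compounds turn up in which run, a comparison along the following lines could help. It assumes each run has already been reduced to a table of identified compounds and peak areas; the compound names and values below are invented for illustration.

```python
# Sketch of a duplicate-run comparison: flag compounds found in only one run,
# or whose peak areas differ by more than a chosen fraction. Example data only.
run_a = {"toluene": 1.2e6, "benzaldehyde": 3.4e5, "hexanal": 8.0e4}
run_b = {"toluene": 1.1e6, "benzaldehyde": 3.1e5}

def compare_runs(a, b, tol=0.5):
    """Report compounds missing from one run or differing by more than `tol` (relative)."""
    for compound in sorted(set(a) | set(b)):
        if compound not in a or compound not in b:
            print(f"{compound}: only in one run -- re-check the raw data")
            continue
        rel_diff = abs(a[compound] - b[compound]) / max(a[compound], b[compound])
        flag = "OK" if rel_diff <= tol else "large difference"
        print(f"{compound}: {flag} (rel. diff. {rel_diff:.0%})")

compare_runs(run_a, run_b)
```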

However, once I am done with a sample, I am getting good results, and even the weaker signals match well between duplicate runs. I have now analysed samples from three locations (Mont St. Hilaire and the McGill campus, collected in 2005, and Resolute, sampled in 2004 for comparison).

Data from McGill Campus and Mont Saint Hilaire Samples

In the past few weeks I have run samples that I collected in spring 2005 on the McGill campus and at the McGill research station at Mont Saint Hilaire. The samples were surface snow from the top 10 cm of the snow pack, taken as grab samples with pre-cleaned and sterilised equipment.

Because of the low concentrations present in the samples, I am resorting to large sample volumes collected in sterile HDPE bottles. So far I have had no contamination from plasticisers leaching out of the container material, which is fine. These containers allow me to obtain a total sample volume of about 130 mL and still retain about the same amount for later experiments.

I melt the samples before my measurements and, once melted, keep them on ice and covered in tin foil to avoid any exposure to UV light and to minimise volatilisation of compounds. I do not (yet) have tight containers of that size, so I am using a set-up similar to the one described by Petterson (Chemosphere, 2004): a 125 mL Erlenmeyer flask with a tin-foil cover after sterile transfer of the sample.

I am still crunching my data, but at first glance I get good signals for aromatic compounds and a few halogenated substances (mostly chlorobenzene, plus tetrachloroethylene in one sample). There are surprisingly many aldehydes and alcohols present (mostly with aliphatic chains attached), which I have not seen in this abundance before. The scarcity of organohalogens is still surprising and remains to be investigated – although a lot of explanations are possible here (age of the snow pack, precipitation, …)

Collecting data …

It seems that the hard work of the last week has paid off. After a good look at my data I have decided to make some (rather simple, but nevertheless effective) changes to my method:

* I have increased the sample size by a factor of 6
* I have increased the adsorption time from 40 min to 120 min

The latter had only a minor effect on the results (so I will stick with the shorter adsorption time), but the increase in sample size did the trick. The tiny peaks with only a very small number of fragments for some aromatic compounds are now clearly separated from the noise, and the underlying mass spectra pass both my own checks and the NIST library search.
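For the curious: the library comparison boils down to something like the weighted dot product below. This is only a toy stand-in for the real NIST match factor, and both spectra are invented (toluene-like fragments), but it shows the idea.

```python
# Toy spectrum-vs-library comparison, assuming spectra are simple {m/z: intensity}
# tables. The score is a plain cosine similarity; the spectra are made up.
import math

def cosine_match(spec_a, spec_b):
    """Cosine similarity of two centroided mass spectra (0 = no match, 1 = identical)."""
    mzs = set(spec_a) | set(spec_b)
    dot = sum(spec_a.get(mz, 0.0) * spec_b.get(mz, 0.0) for mz in mzs)
    norm_a = math.sqrt(sum(v * v for v in spec_a.values()))
    norm_b = math.sqrt(sum(v * v for v in spec_b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

measured = {91: 100.0, 92: 60.0, 65: 12.0, 39: 8.0}   # toluene-like fragments
library  = {91: 100.0, 92: 58.0, 65: 10.0, 39: 7.0}
print(f"match score: {cosine_match(measured, library):.3f}")
```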

I have analysed samples from the McGill campus lawn and from a site at the McGill Research Station at Mont Saint Hilaire (east of Montreal), and the main compound groups I have detected were aromatic compounds (toluene, xylenes, benzaldehyde), aldehydes and some aliphatic alcohols.

However, I am still wondering whether halogenated compounds are present (not even the smallest isotope signatures from chlorine or bromine so far), and it seems that benzene is still buried in the water peak at the beginning of the run (although I have modified my method so that it elutes later, in a less contaminated section of the chromatogram – this works fine for toluene, but not for benzene). Another explanation, of course, is that these compounds are not present, or only at very low levels. Well, there is more to look at.
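The isotope check I keep doing by eye can also be written down: natural chlorine is roughly 76 % 35Cl / 24 % 37Cl and bromine roughly 51 % 79Br / 49 % 81Br, so a single Cl gives an M+2 peak of about a third of M and a single Br an M+2 peak nearly as tall as M. A little sketch (with made-up intensities) for classifying an M/M+2 pair:

```python
# Quick check for chlorine/bromine isotope signatures from an M / M+2 pair.
# Natural abundances: 35Cl/37Cl ~ 75.8/24.2 %, 79Br/81Br ~ 50.7/49.3 %.
CL_RATIO = 24.2 / 75.8   # expected M+2 / M for one chlorine (~0.32)
BR_RATIO = 49.3 / 50.7   # expected M+2 / M for one bromine (~0.97)

def classify_m2(m_intensity, m2_intensity, tolerance=0.10):
    """Guess whether an M/M+2 pair looks like one Cl, one Br, or neither."""
    ratio = m2_intensity / m_intensity
    if abs(ratio - CL_RATIO) <= tolerance:
        return f"looks like one Cl (M+2/M = {ratio:.2f})"
    if abs(ratio - BR_RATIO) <= tolerance:
        return f"looks like one Br (M+2/M = {ratio:.2f})"
    return f"no obvious Cl/Br signature (M+2/M = {ratio:.2f})"

print(classify_m2(1000.0, 320.0))   # ~0.32 -> chlorine-like (invented numbers)
print(classify_m2(1000.0, 50.0))    # -> no halogen signature
```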

Getting there

I have continued my GC-MS measurements during the last two weeks and have come up with some encouraging results.

First of all, I have fitted the inlet with a special SPME liner (with a narrow bore and small volume for splitless injection) that should improve peak shapes – and it indeed does. My peaks are sharper and better separated, which is good, because I still have a considerable number of peaks affected by bleed (although a lot fewer since the liner replacement).

As a result, some of the compounds that I could not detect in previous runs (or where only a few fragments indicated their presence) are now appearing more clearly. I could identify chlorobenzene and 1,2-dichlorobenzene, toluene and some other compounds in my urban samples, confirming results from last year’s measurements.

The downside is that I am not quite there yet. Levels seem to be a lot lower than last year, which needs to be explained (e.g. snow properties, …), and some compounds that I should find, given the heavy traffic just off campus, still have not turned up.

And some more maintenance

Well, it looks like the GC-MS has needed a lot of attention recently. After a fatal crash (or rather more than one), the connection between the computer and the instrument was down for good. After uninstalling the software and doing some additional clean-up by hand, we reinstalled and reconfigured it.

All problems solved – and the beast is running nicely again. No data were lost either – so the operation was a success. A couple of hours well spent. Tomorrow is my day on the instrument and I am looking forward to some improved results (*keepingmyfingerscrossed* despite being a scientist).

Column bleeding all over (but no violence ;)

Thanks to a night shift by a colleague of mine, the GC-MS is up and running again. The baseline is still pretty elevated (about 4× as high as before), but that could also be because the previous filament had become inefficient, so the old baseline was artificially low. Anyway, the baseline itself seems to be fine.

But the column bleeding is not. I have made some tests as a performance check:
1. Instrument background, no injection
2. Fiber backgrounds (PDMS/DVB and polyacrylate fibers) after conditioning
3. MilliQ blanks with both fibers

… and I have found a lot (read: too much) of bleed in all of them. I am working in splitless mode, purging after 3 min at 50 mL/min.
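To put a number on the bleed rather than eyeballing the baseline, a simple bleed index could be computed from extracted-ion traces, since siloxane column bleed shows up mainly at m/z 207 and 281. The sketch below assumes the traces can be exported as plain lists; the numbers are invented.

```python
# Rough bleed index: fraction of the late-run TIC carried by the siloxane
# bleed ions m/z 207 and 281. The traces below are invented example data.
def bleed_index(tic, eic_207, eic_281, tail_fraction=0.25):
    """Average (m/z 207 + 281) / TIC over the last `tail_fraction` of the run."""
    n = len(tic)
    start = int(n * (1 - tail_fraction))
    ratios = [(eic_207[i] + eic_281[i]) / tic[i] for i in range(start, n) if tic[i] > 0]
    return sum(ratios) / len(ratios)

tic     = [1000, 1100, 1200, 5000, 6000, 6500]
eic_207 = [  10,   12,   15, 1500, 1900, 2100]
eic_281 = [   5,    6,    8,  800, 1000, 1200]
print(f"bleed index (late run): {bleed_index(tic, eic_207, eic_281):.2f}")
```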

For comparison, I have also tried split mode (10:1 ratio, otherwise identical conditions) for one of my samples. This reduces the column bleed significantly (to an acceptable level), but it also reduces the sensitivity too much: with roughly only a tenth of the analytes reaching the column, I can no longer detect compounds that I do see in splitless mode *sigh*.

Finally, I ended up checking and changing the liner – and this was good, as quite a bit of junk had already accumulated. I have also moved the glass-wool packing a bit towards the end in order to avoid any contact with the fiber. Additionally, I can now place the fiber more centrally in the liner during desorption, closer to the column and near the temperature optimum. Ultimately, I would like to use a liner that is better suited for SPME with splitless injection (smaller volume and narrower bore) in order to optimise the focusing of desorbed compounds.

Some first experiments with the blank fiber showed significant improvements: most of the bleed is gone – even in splitless mode. Let’s see if that continues when used with samples – I am hopeful.