
What causes the most headaches for a particle characterization scientist?

28 June 2018


There are many answers to this question, but one of the most obvious is dust or contaminants! As someone who regularly uses Dynamic Light Scattering (DLS) to measure particle size, the possibility of there being dust or contaminants in my sample is one of the things I always bear in mind.

The way I prepare my sample, the way I store it before measurement, remembering to put the measurement cap on the cuvette only after I've cleaned it with an air duster... all these steps are vital to stop even a minuscule amount of dust stealthily finding its way into my cuvette and sample. I know that if I don't make a real effort to reduce the possibility of dust contamination, I will have a hard time interpreting my results.

Zetasizer Pro & Ultra

Now, with the launch of the new Zetasizer Pro and Zetasizer Ultra, there is finally a tool to help us particle scientists ease the pain! It's called Adaptive Correlation, and it's a new way of classifying DLS data. Simply put, Adaptive Correlation is a clever algorithm that works in the background during a sizing measurement, determining which parts of the data generated belong to the same sample population. If anything atypical of the sample is measured, such as a transient event caused by a contaminant, Adaptive Correlation will pick it up.
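Malvern's actual algorithm is proprietary, but the general idea of splitting sub-measurements into a "steady state" population and "transient" outliers can be illustrated with a simple robust outlier test. The sketch below is purely hypothetical: it flags sub-runs whose mean count rate deviates strongly from the ensemble median, which is one plausible way to catch a dust spike. The function name, the choice of count rate as the statistic, and the MAD-based threshold are all my assumptions, not Malvern's implementation.

```python
import statistics

def classify_subruns(subrun_rates, z=3.0):
    """Split sub-measurements into steady-state and transient groups.

    Illustrative sketch only, NOT Malvern's proprietary Adaptive
    Correlation algorithm. A sub-run whose mean count rate sits more
    than z robust standard deviations from the ensemble median is
    flagged as transient (e.g. a dust particle drifting through the
    detection volume); the rest form the steady-state population.
    """
    median = statistics.median(subrun_rates)
    # Median absolute deviation: a spread estimate that is not
    # itself dragged upward by the very outliers we want to catch.
    mad = statistics.median(abs(r - median) for r in subrun_rates)
    spread = 1.4826 * mad or 1e-12  # scale MAD to ~sigma; avoid div by zero
    steady, transient = [], []
    for i, rate in enumerate(subrun_rates):
        (transient if abs(rate - median) / spread > z else steady).append(i)
    return steady, transient

# A run of quiet sub-measurements plus one dust-contaminated spike:
steady, transient = classify_subruns([100, 102, 98, 101, 99, 450])
```

The key design point, which the article's description shares, is that contaminated sub-runs are classified and set aside rather than discarded blindly, so the steady-state result is built only from data representative of the sample.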

This is demonstrated in the figure below, which shows particle size distributions for aliquots of lysozyme filtered through 100 nm and 20 nm filters, with data captured using Adaptive Correlation and an alternative 'dust rejection' algorithm. Adaptive Correlation gives better-resolved and more repeatable results for a sample passed through a 100 nm filter than the alternative 'dust rejection' algorithm manages with the same sample after passing through a 20 nm filter!

[Figure: lysozyme size distributions after 100 nm and 20 nm filtration, Adaptive Correlation vs 'dust rejection' algorithm]

Adaptive Correlation therefore provides a more robust answer to the question, "What is the real size of my particles in this sample, and what does the size distribution look like?" I no longer have to worry about transient events, nor do I have to filter repeatedly with expensive filters to get a really clean sample. I can look at my data and see the main population, called 'steady state' in the ZS Xplorer software, and I can also, if I want, see the transient events: those deemed not representative of the sample population. I can also see how many of my sub-measurements were classified as 'steady state' or 'transient', allowing me to track changes in my samples over time.

For more information about Adaptive Correlation, download our new application notes: