
Does anybody know how DVS calculates normalization factors?

Forum rules
Please be as geeky as possible. Reference, reference, reference.
Also, please note that this is a mixed bag of math-gurus and mathematically challenged, so choose your words wisely :-)

tesla276

Participant

Posts: 4

Joined: Thu Feb 26, 2015 8:34 pm

Post Thu Feb 26, 2015 9:01 pm

Does anybody know how DVS calculates normalization factors?

I've recently been digging into exactly how the DVS normalization works, and I've come to a bit of a strange finding. Their literature suggests that they look at the ratio between the medians observed for the beads and the global standards they have determined (Passport), and then linearly interpolate between these ratios to determine the appropriate scaling factors for all mass channels. I've compared my files before and after normalization, and my findings don't exactly match the description of their methods. I've attached a file showing a plot of the scaling factors applied per channel, and they aren't linearly interpolated between the bead channels.
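For reference, here's a minimal sketch of how I read their description, with made-up median values (the bead masses are the DVS bead channels; everything else is placeholder data, not anything from my files):

```python
import numpy as np

# DVS bead channel masses (Ce140, Eu151, Eu153, Ho165, Lu175);
# all intensity values below are invented just to show the arithmetic.
bead_masses = np.array([140, 151, 153, 165, 175])
observed_medians = np.array([950.0, 1180.0, 1060.0, 2350.0, 820.0])
passport_medians = np.array([1000.0, 1150.0, 1050.0, 2300.0, 850.0])  # global standards

# Per-bead scaling factor: multiply the observed signal by this to hit the standard.
bead_factors = passport_medians / observed_medians

# Linearly interpolate those factors across all mass channels
# (np.interp simply holds the end values flat outside the bead range).
all_masses = np.arange(89, 210)
channel_factors = np.interp(all_masses, bead_masses, bead_factors)
```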

I'm trying to figure out where this parabolic profile comes from. The only thing I've recognized so far is that this curve is very similar to the curve showing the sensitivities of the various mass channels, peaking right at 165. I was wondering if anyone had any insight into this? Thank you very much!!

FILE: https://drive.google.com/open?id=0B8zwG-viIv7hSUFsVE5tYUgzTFk&authuser=1

-Nick

mleipold

Guru

Posts: 5792

Joined: Fri Nov 01, 2013 5:30 pm

Location: Stanford HIMC, CA, USA

Post Fri Feb 27, 2015 5:40 pm

Re: Does anybody know how DVS calculates normalization factors?

Hi Nick,

I'm sure Fluidigm will chime in on this later, but I remember a similar discussion (not on Cytoforum) years ago about the Di calibration on the tuning solution.

As I remember it, in short, the standard (tuning solution, or beads) has several isotopes that span the mass range of the instrument. Each of those isotopes is calibrated exactly during calibration. Those points then become the framework for the instrument's signal profile for everything else, by two-sided interpolation between, or one-sided extrapolation from, the nearest exactly calibrated point(s).

So, if there's a curve in the calibrated points, that should be recapitulated in the Normalized data.
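Something like the following toy sketch, if that helps; all numbers are invented, and scipy's `interp1d` is just one convenient way to get linear interpolation between points plus one-sided extrapolation beyond them:

```python
import numpy as np
from scipy.interpolate import interp1d

# Masses of the exactly calibrated isotopes and a made-up response
# measured at those points (tuning solution or beads).
calibrated_masses = np.array([140, 151, 153, 165, 175])
calibrated_response = np.array([0.90, 1.10, 1.15, 1.40, 1.05])

# Linear interpolation between calibrated points, with one-sided linear
# extrapolation from the nearest pair of points outside that range.
profile = interp1d(calibrated_masses, calibrated_response,
                   kind="linear", bounds_error=False, fill_value="extrapolate")

all_masses = np.arange(89, 210)
instrument_profile = profile(all_masses)
# Any curvature in the calibrated points carries through to every other mass.
```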


Regarding your example data file: something looks really funny with your Ho165 results. Your Eu151 and Lu175 also look odd, but not as super-far-off as the Ho165. I don't have any idea of what could cause that....


Mike

anitamkant

Master

Posts: 51

Joined: Mon Nov 18, 2013 6:30 am

Post Fri Feb 27, 2015 7:30 pm

Re: Does anybody know how DVS calculates normalization factors?

Hi Nick, thanks for starting an important discussion. Mike, thanks for the input.
Fluidigm is in the process of adding a more detailed explanation of the algorithm, along with experimental details, to the documentation.
We will post the information in the near future.
Thanks

tesla276

Participant

Posts: 4

Joined: Thu Feb 26, 2015 8:34 pm

Post Sat Feb 28, 2015 2:34 am

Re: Does anybody know how DVS calculates normalization factors?

Mike and Anita, thank you very much for your response. I too am confused by the deviation of several of the Bead channels (151, 165, and 175) from the pattern, especially if they are being used to determine the scaling factors in the first place. I have asked Fluidigm about this and sent them some data files before and after normalization. When they get back to me I'll be sure to post their response here.

Thank you!
Nick

vmotta

Participant

Posts: 13

Joined: Mon Jan 26, 2015 10:34 pm

Post Sun Feb 12, 2017 3:37 pm

Re: Does anybody know how DVS calculates normalization factors?

Hello Everyone,

I am wondering what one should expect after normalizing the files using the latest version of the Fluidigm software (I think 6.5, with bead passport version 2).

I have performed the normalization and compared the files before and after. I am not sure exactly how it should work, so I would like to ask your opinion on whether everything is working well for me.

My analysis is attached.

You will see the dual counts for beads (140 channel), intercalator (191), and CD45 (147) in 5 different samples of mouse cells acquired on the same day.

It is easy to see that the bead readings differ between samples and become very similar after normalization. However, I was expecting the ratio of median counts (original file / normalized) to be the same across all channels, and it does not work out that way. There may be a more sophisticated calculation behind the normalization, but I just wanted to make sure it is working properly in my hands.
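In case it is useful, this is roughly the kind of check I mean; the file names and channel labels are hypothetical, and fcsparser is just one example of an FCS reader:

```python
import numpy as np
import fcsparser  # one example FCS reader; any reader would do

def median_ratios(raw_path, norm_path, channels):
    """Per-channel ratio of medians: original file / normalized file."""
    _, raw = fcsparser.parse(raw_path)
    _, norm = fcsparser.parse(norm_path)
    return {ch: np.median(raw[ch]) / np.median(norm[ch]) for ch in channels}

# Hypothetical file names and channel labels; substitute your own panel.
ratios = median_ratios("sampleA_raw.fcs", "sampleA_normalized.fcs",
                       ["Ce140Di", "Ir191Di", "Sm147Di"])
print(ratios)
```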

The first 2 pages are the histograms done in Cytobank for beads, intercalator, and CD45. The table shows the median counts before and after normalization for each sample.

The last 2 pages show the same files plotted together in R for visualization purposes.

Thank you for your feedback

Vinicius
Attachments
TempNormalizationCyTOF.pdf
(958.9 KiB) Downloaded 512 times

mleipold

Guru

Posts: 5792

Joined: Fri Nov 01, 2013 5:30 pm

Location: Stanford HIMC, CA, USA

Post Mon Feb 13, 2017 4:06 pm

Re: Does anybody know how DVS calculates normalization factors?

Hi Vinicius,

Could you tell us more about what these files are? Such as:
1) Are they replicate samples (i.e., the same donor aliquot, split, stained separately with the same reagents on the same day, run on the same day)?
2) Are these separate aliquots of the same donor, stained on the same day with the same reagents, run on the same day?
3) Are these different donors, stained on the same day with the same reagents, run on the same day, back to back to back?

Nothing looks obviously wrong to me in how your data looks, pre- and post-normalization. The reason I ask whether these are the same donor (or even different aliquots of the same donor) is that staining intensity differences can have experimental causes: washes (including efficiency of resuspension), slight variations in marker expression or Ir uptake, etc. And this is on top of any true biological differences between donors. If those are "true" differences, then normalization shouldn't remove them.

Regarding different (original/normalized) differences in the Bead isotopes: if your instrument varies in mass sensitivity response (see Tricot et al) compared to the instrument(s) that Fluidigm used to determine the EQ bead "correct" values, then your "original" numbers would be "forced" to the "correct" numbers by varying amounts.


Mike

vmotta

Participant

Posts: 13

Joined: Mon Jan 26, 2015 10:34 pm

Post Tue Feb 14, 2017 2:20 am

Re: Does anybody know how DVS calculates normalization factors?

Hi Mike,

Thank you for your response.

These are different samples. Samples A and B are one type of cell preparation from mouse A and B. Samples C and D are a different cell preparation also from mouse A and B. Sample E is control splenocytes from mouse A. They were all stained on the same day with the same antibody mix.

I thought the normalization would be simple math: a ratio of the beads in my files against the Passport EQ beads, applied to all channels. But if I understood correctly after discussing with Narges (Fluidigm), the normalization takes into account the different sensitivity for each EQ bead mass (140, 142, 151, 153, 165, 171, and 172).

I am wondering whether you always normalize your .FCS files before sending them to clients, i.e., should one always normalize the files?

Thank you

Vinicius

mleipold

Guru

Posts: 5792

Joined: Fri Nov 01, 2013 5:30 pm

Location: Stanford HIMC, CA, USA

Post Tue Feb 14, 2017 4:13 pm

Re: Does anybody know how DVS calculates normalization factors?

Hi Vinicius,

I'm not sure what you mean by "the normalization takes into account the different sensitivity for each EQ beads mass (140,142,151,153,165,171 and 172)".

Regarding customer files: it depends on what the customer asks for. If the customer requests normalization, then we perform it. However, we generally try to be agnostic about the normalization method: most people want the Fluidigm method, because that's "easiest". I personally disagree with the Fluidigm method; I believe that the MATLAB method is better, since it doesn't force your data to conform to an external standard that may not be reflective of your machine.

As an example:
Prestained-machine models-normalizations.pdf
(68.15 KiB) Downloaded 394 times


Here, separate aliquots of the same prestained sample (stained, frozen, then thawed similar to Sumatoh et al) were run on different machines: a CyTOFv1, a CyTOFv2, two different Helios instruments, and the same CyTOFv2 after it was upgraded to a Helios. At the top are the EQ bead results. At the bottom are various cell marker signals. At the left is the raw data; center is Fluidigm-ver2 normalized; and right is MATLAB-normalized.

The pink (CyTOFv1) bead profile has a different mass sensitivity profile than the v2 or Helios instruments. Similarly, the different instruments have different overall signal intensities/sensitivities. As you can see, the Fluidigm-ver2 method forces all of them to conform to the same values, in several cases decreasing signal intensity and, in the case of the CyTOFv1 data, warping it to fit the mass sensitivity profile. The MATLAB method, on the other hand, just averages the signal intensity, with minimal warping.
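To make that distinction concrete, here's a toy numerical sketch (all values invented, and I'm ignoring the time-window smoothing that the real algorithms do):

```python
import numpy as np

# Toy bead medians for the same sample run on three instruments
# (rows = instruments, columns = bead channels); all numbers invented.
bead_medians = np.array([[ 800., 1500., 2100.],
                         [1000., 1200., 2600.],
                         [ 900., 1400., 2300.]])

# External "correct" values in the style of the Fluidigm/Passport approach.
reference = np.array([950., 1300., 2500.])

# Fluidigm-style: every instrument is forced toward the external reference,
# so a different mass-sensitivity profile gets warped to match it.
factors_to_reference = reference / bead_medians

# MATLAB-method-style: normalize toward the average of the runs themselves,
# so no external profile is imposed on the data.
factors_to_mean = bead_medians.mean(axis=0) / bead_medians
```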


Is normalization required in all cases? It depends on what you're asking. If you want to compare signal Medians (fold-change, clustering, etc), then it's probably necessary in all cases. However, if you have a well-designed panel and you're just comparing frequencies, then it's not always needed.

As an example:
Normalization-Freq Parent vs Signal Median.pdf
(222.87 KiB) Downloaded 401 times


Here, the same sample was stained, washed, and then split in two. One aliquot was run as the first sample of the day, and the second aliquot was run as the last sample of the day (~8 hr later). I then divided the Late sample by the Early sample, for both Freq Parent and Median intensity. Normalization definitely helps make Median intensity values more stable, but Frequency isn't affected nearly as much. I've done this sort of assay at least twice, and it's consistent.


In short: normalization won't really hurt, and it can definitely help. But it's important to choose a normalization method and stick with it, for consistency. And, of course, save a copy of the Raw files in case you decide that you want to switch normalization methods!


Mike

vmotta

Participant

Posts: 13

Joined: Mon Jan 26, 2015 10:34 pm

Post Wed Feb 15, 2017 9:28 pm

Re: Does anybody know how DVS calculates normalization factors?

Hi Mike,

That is great. Thank you so much for sharing and for your feedback.

Best

Vinicius
