> No, the PSF is a quality of the optics. The PSF is what is scanned. The
> impulse response is what is elongated. The measurement value that the
> pixel represents is (in my simplistic understanding of signal
> processing) an integral over time of a moving PSF that is varying in
> intensity as a function of time. The location of the pixel is calculated
> from the scan mirror and stage positions.

Sorry---I have been using the wrong term. What I was referring to is the
sampling kernel: the whole function comprising all effects that occur
during the transfer from a physical-world phenomenon into a pixel sample
value. That includes the PSF, the motion of the specimen or sensor
(motion blur), quantization of the signal in the sensor, noise, and
whatever else. In a microscope, this kernel is some difficult-to-obtain
n-dimensional thing (space, time, wavelength, m properties of the dye and
specimen), and it is the target of the ideal deconvolution framework.
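To make that concrete, here is a minimal 1D sketch of such a kernel,
assuming only a Gaussian PSF and a constant-velocity scan (so the motion
contributes a box function over one pixel dwell). Every name and number
is made up for illustration; the real kernel is of course n-dimensional
and much harder to obtain.

// Toy 1D sampling kernel: Gaussian PSF convolved with the box function
// traced out by constant-velocity scan motion during one pixel dwell.
// All parameters are invented for illustration.
public class SamplingKernelSketch {
    public static void main(String[] args) {
        int n = 201;          // samples across the kernel support
        double dx = 0.01;     // spatial step, arbitrary units
        double sigma = 0.2;   // Gaussian PSF width
        double dwell = 0.5;   // distance scanned during one pixel dwell

        double[] psf = new double[n];
        double[] motion = new double[n];
        for (int i = 0; i < n; i++) {
            double x = (i - n / 2) * dx;
            psf[i] = Math.exp(-x * x / (2 * sigma * sigma));
            motion[i] = Math.abs(x) <= dwell / 2 ? 1 : 0; // box: uniform speed
        }
        double[] kernel = normalize(convolve(psf, motion));
        // kernel now holds the spatial weighting with which a single pixel
        // sample integrates the specimen: PSF and motion blur combined.
        for (int i = 0; i < n; i += 20)
            System.out.printf("x=%6.2f  w=%.4f%n", (i - n / 2) * dx, kernel[i]);
    }

    // plain discrete convolution, b centered on its middle element
    static double[] convolve(double[] a, double[] b) {
        double[] out = new double[a.length];
        for (int i = 0; i < a.length; i++)
            for (int j = 0; j < b.length; j++) {
                int k = i - (j - b.length / 2);
                if (k >= 0 && k < a.length) out[i] += a[k] * b[j];
            }
        return out;
    }

    static double[] normalize(double[] v) {
        double sum = 0;
        for (double x : v) sum += x;
        for (int i = 0; i < v.length; i++) v[i] /= sum;
        return v;
    }
}

Real acquisitions would add quantization and noise on top, but even this
toy version shows how the effective kernel is wider than the PSF alone.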
> > However, that PSF would still be symmetric, except
> > if the sensor has a tendency to collect photons only later or earlier
> > during exposure.
>
> Such as the situation if the scan is accelerating or decelerating during
> acquisition, which happens in many designs: one 'side' of the pixel will
> tend to be brighter than the other 'side' (one end of the integral will
> be longer than the other).

Good to know---has this ever been measured? It would be nice to include
it as a trivial-to-understand part of the above-mentioned kernel.
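As a back-of-the-envelope illustration (linear deceleration and all
numbers here are my own assumptions): the weight at each position within
the pixel is proportional to the dwell time there, i.e. 1/speed, so a
decelerating scan shifts the kernel's centroid off the geometric pixel
center.

// Asymmetry a decelerating scan imprints on the sampling kernel.
// Linear deceleration and all numbers are assumptions.
public class ScanAsymmetrySketch {
    public static void main(String[] args) {
        int n = 100;                 // spatial bins across one pixel
        double v0 = 2.0, v1 = 1.0;   // scan speed at pixel entry / exit

        double sum = 0, centroid = 0;
        for (int i = 0; i < n; i++) {
            double x = (i + 0.5) / n;          // position in the pixel, 0..1
            double v = v0 + (v1 - v0) * x;     // linearly decelerating scan
            double w = 1.0 / v;                // dwell time ~ 1/speed
            sum += w;
            centroid += x * w;
        }
        centroid /= sum;
        // > 0.5: the slow (exit) side of the pixel collects more light.
        System.out.printf("kernel centroid within pixel: %.3f%n", centroid);
    }
}

With these numbers the centroid comes out near 0.56 rather than 0.5,
which is exactly the kind of easily understood asymmetry that could go
into the kernel---and it shows directly that the sample's representative
coordinate need not sit at the pixel center.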
> > So to make my statement clearer, a point sample taken
> > from the physical world is almost never a point-sample, but its
> > representative coordinate is most likely not at the top-left corner of
> > the area between samples.
>
> Agreed, if we want to relate pixels back to the real world in some
> reliable fashion (that's the general goal, right?)

Yes.

Cheers,
Stephan

_______________________________________________
ImageJ-devel mailing list
[email protected]
http://imagej.net/mailman/listinfo/imagej-devel