sensors rather than cameras

I couldn’t find any previous discussion on sensors, so I’m asking here…

The current camera descriptions are not rich enough to accommodate all the parameters I need for different types of sensors, such as LIDAR. A LIDAR has sweep angles, sample frequency, and so on (note that a laser ranging sensor is its own light source).

I know there’s <extra> available, but …
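For concreteness, the sort of thing I have been sketching with <extra> looks roughly like this. The profile name and every element inside the technique (sweep angles, sample rate, and so on) are my own invention, not anything defined by the COLLADA schema:

<camera id="scanner">
  <optics>
    <technique_common>
      <perspective>
        <yfov>30.0</yfov>
        <znear>0.1</znear>
        <zfar>120.0</zfar>
      </perspective>
    </technique_common>
  </optics>
  <extra>
    <technique profile="MY_LIDAR_PROFILE">
      <!-- hypothetical elements -->
      <horizontal_sweep_angle>360.0</horizontal_sweep_angle> <!-- degrees -->
      <vertical_sweep_angle>26.8</vertical_sweep_angle>      <!-- degrees -->
      <sample_rate>100000</sample_rate>                      <!-- samples per second -->
      <laser_wavelength>905</laser_wavelength>               <!-- nanometres; the sensor is its own light source -->
      <max_range>120.0</max_range>                           <!-- metres -->
    </technique>
  </extra>
</camera>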

I was wondering if anyone knew of any efforts to marry SensorML with COLLADA camera descriptions using XSLT. If not, could someone suggest steps to do this?
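To make the question concrete, the shape of transform I imagine is something like the sketch below. The match on the SensorML side and the output profile name are placeholders I made up; a real stylesheet would have to follow the actual SensorML structure:

<?xml version="1.0"?>
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns="http://www.collada.org/2005/11/COLLADASchema">

  <!-- Sketch only: copy whatever characteristics block the SensorML
       document exposes into a COLLADA <extra> technique. -->
  <xsl:template match="/">
    <extra>
      <technique profile="MY_SENSORML_BRIDGE">
        <xsl:copy-of select="//*[local-name()='characteristics']"/>
      </technique>
    </extra>
  </xsl:template>

</xsl:stylesheet>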

Also, I like the idea of abstracting “camera” to “sensor”, since thermometers can “see” the environment, a GPS receiver can or can’t have line of sight to satellites, and then there are accelerometers, microphones, and so on.

… While I’m thinking about it, <light> doesn’t accommodate a <sound> source.

Please don’t misunderstand me; this is not criticism. I’m mainly wondering if anyone else has the same interests and maybe already has <extra> tags that describe physical modalities other than light and solid-body dynamics.

Thanks.

Does anyone have an extension that describes light sources in terms of wavelength rather than color?
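To illustrate what I mean: an ordinary <light> carrying spectral data in an <extra> technique, roughly like the snippet below, where the profile and element names are just placeholders of mine:

<light id="laser_905nm">
  <technique_common>
    <point>
      <color>1.0 0.0 0.0</color> <!-- fallback RGB for tools that ignore the extra -->
    </point>
  </technique_common>
  <extra>
    <technique profile="MY_SPECTRAL_PROFILE">
      <!-- hypothetical elements, values in nanometres -->
      <wavelength>905</wavelength>
      <bandwidth>5</bandwidth>
    </technique>
  </extra>
</light>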

This sounds like an implementation question, but it has design implications.

Thanks.

Not that I’m aware of.

The COLLADA camera is designed to represent electromagnetic radiation sensors in general. The <optics> element describes the refractive and reflective elements while the <imager> describes the sensor characteristics.

Designing extensions for your use cases would involve creating alternative <technique> elements within <optics> and <imager>. I don’t think it would be enough to add <extra> data to the <technique_common> representation of <camera>, which describes a lens that projects a planar image onto an RGB sensor.
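Schematically, that would look something like the example below; the profile name and the parameters inside the custom techniques are invented for illustration only:

<camera id="lidar_scanner">
  <optics>
    <!-- <technique_common> is still required, so provide a fallback projection -->
    <technique_common>
      <perspective>
        <yfov>30.0</yfov>
        <znear>0.1</znear>
        <zfar>200.0</zfar>
      </perspective>
    </technique_common>
    <!-- alternative optics description; profile and elements are hypothetical -->
    <technique profile="MY_LIDAR_PROFILE">
      <horizontal_sweep>360.0</horizontal_sweep>
      <vertical_sweep>26.8</vertical_sweep>
      <beam_divergence>0.003</beam_divergence>
    </technique>
  </optics>
  <imager>
    <!-- <imager> has no technique_common, only profile-specific techniques -->
    <technique profile="MY_LIDAR_PROFILE">
      <detector_wavelength>905</detector_wavelength>
      <samples_per_second>700000</samples_per_second>
    </technique>
  </imager>
</camera>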