Camera Related

A-001: What’s the difference between ‘unbuffered’ and ‘buffered’ cameras?

[A] From a hardware standpoint, the key difference between an unbuffered and a buffered camera is that the former doesn’t have on-board memory (a ‘frame buffer’) to store captured images, whereas the latter does.

As a result, an unbuffered camera cannot capture images until the host computer it is connected to has prepared the resources needed to store/process them. If the computer is busy, image acquisition has to be delayed. In addition, while the computer is busy with one camera, it cannot support a 2nd or a 3rd camera. Unbuffered cameras are therefore usually not appropriate for multi-camera applications.

A buffered camera, by contrast, doesn’t have to rely on the computer at all times to capture images: acquired images can be stored temporarily in the on-board memory until the computer is available, at which point they are downloaded to the computer to be processed/saved. Since buffered cameras can capture images without waiting for the computer to prepare resources, multiple cameras can be connected to the same computer and acquire images concurrently; in other words, buffered cameras support multi-camera applications.
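
To make the distinction concrete, below is a minimal sketch – hypothetical C++ code, not Mightex SDK code – of the role the on-board frame buffer plays: the sensor side deposits frames as they are captured, and the host side drains them whenever it becomes available. The 8-frame capacity and the 640x480 frame size are arbitrary assumptions.

    // Hypothetical sketch of a camera's on-board frame buffer as a ring buffer.
    // The "sensor" deposits frames at its own rate; the "host" drains them when ready.
    #include <array>
    #include <cstddef>
    #include <cstdio>
    #include <mutex>
    #include <optional>
    #include <vector>

    struct Frame {
        int id;
        std::vector<unsigned char> pixels;
    };

    class FrameBuffer {  // stands in for the camera's on-board memory
    public:
        bool push(Frame f) {  // called at the sensor's frame rate
            std::lock_guard<std::mutex> lock(m_);
            if (count_ == slots_.size()) return false;  // buffer full: frame is dropped
            slots_[head_] = std::move(f);
            head_ = (head_ + 1) % slots_.size();
            ++count_;
            return true;
        }
        std::optional<Frame> pop() {  // called whenever the host is ready
            std::lock_guard<std::mutex> lock(m_);
            if (count_ == 0) return std::nullopt;
            Frame f = std::move(slots_[tail_]);
            tail_ = (tail_ + 1) % slots_.size();
            --count_;
            return f;
        }
    private:
        std::array<Frame, 8> slots_;  // assumed capacity: 8 frames of on-board storage
        std::size_t head_ = 0, tail_ = 0, count_ = 0;
        std::mutex m_;  // guards sensor/host access in a real concurrent setup
    };

    int main() {
        FrameBuffer buf;
        for (int i = 0; i < 5; ++i)  // the "sensor" captures 5 frames while the host is busy
            buf.push(Frame{i, std::vector<unsigned char>(640 * 480)});
        while (auto f = buf.pop())   // the "host" downloads them later, at its own pace
            std::printf("downloaded frame %d\n", f->id);
    }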

In summary, the table below shows the key features and differences between unbuffered and buffered cameras:

                                     Unbuffered Cameras    Buffered Cameras
Support multi-camera applications    No                    Yes
Simultaneous image acquisition       No                    Yes
External trigger delay               Uncertain and long    Short
Multiple camera synchronization      No                    Yes

A-002: I need multiple cameras for my application; what camera models should I choose?

[A] If you require multiple cameras for your application, you have to use buffered cameras – unbuffered cameras do not work. For the difference between an unbuffered and a buffered camera, please see A-001 above.

A-003: When should I use a windowless camera?

[A] Most image sensors are covered with a glass window, mainly to protect the sensor during the manufacturing process. This window is harmless in the majority of applications, where incoherent light sources such as halogen bulbs and LEDs are used. In some applications, however, a coherent light source (e.g. a laser) is used, and unwanted interference fringes may be generated by multiple reflections from the top and bottom surfaces of the glass window, producing features in the captured images that do not exist in the scene/object. In such cases, a windowless camera should be used instead, which helps avoid the unwanted interference fringes.
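
For reference, the fringes follow the standard parallel-plate (etalon) interference condition – a textbook relation, not a Mightex specification – where t is the window thickness, n its refractive index, \theta the internal angle of propagation, \lambda the laser wavelength, and m an integer fringe order:

    2 n t \cos\theta = m \lambda

Because t is typically many thousands of wavelengths, even tiny variations in thickness or angle change the order m, producing closely spaced fringes across the image.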

One typical example that requires a windowless camera is ‘laser beam profiling’.

A-004: I am doing laser beam profiling, but have been seeing unwanted interference fringes. What should I do?

[A] This most likely results from interference between the laser beams reflected from the top and bottom surfaces of the glass window in front of the image sensor. This window is placed on top of the image sensor during manufacturing, mainly to protect the fragile sensor.

To eliminate the unwanted interference fringes, a windowless camera should be used instead. In a windowless camera, the glass window – the root cause of the fringes – is removed. For more details, please see https://www.mightexsystems.com/product-category/imaging/usb-cameras/windowless-cameras/.

Windowless cameras are more fragile, so please handle them with great care.

A-005: How do CMOS imagers work?

[A] CMOS-based image sensors are semiconductor devices. Incoming light strikes a photosensitive area, where the photons free electrons. To do that, the photons must be energetic enough; in the case of silicon, that means their wavelength must be shorter than about 1,100 nm.
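
As a quick sanity check on that figure – a back-of-envelope calculation using silicon’s band gap of roughly 1.12 eV and hc ≈ 1240 eV·nm – the longest wavelength whose photons can still free an electron is

    \lambda_{\max} = \frac{hc}{E_g} \approx \frac{1240\ \mathrm{eV\,nm}}{1.12\ \mathrm{eV}} \approx 1100\ \mathrm{nm}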

What happens after the electrons are produced distinguishes CMOS from CCD sensors. In a typical CMOS implementation, active amplification circuitry is connected to each photodetector, allowing the accumulated charge to be converted into a corresponding voltage and digital signal within the pixel, which CCDs cannot do. In addition, a CMOS chip can read out individual pixels within an array via column- and row-selecting circuitry, which cannot be done with CCDs either. These differences in amplification and readout allow CMOS-based sensors to image faster and to consume less power.

However, the advantages of CMOS come at a price. The presence of circuitry reduces the photoactive area within a pixel, and nonuniformities in the amplifiers can lead to uneven pixel response. When these factors are taken into account, CCDs offer better sensitivity.

As for the relative noise of CMOS and CCD technologies, some experts believe that for short integration times (i.e. high frame rates) CMOS does well; for large pixels, CCDs beat CMOS in noise, but for smaller pixels (around 5 microns and smaller), CMOS is able to catch up.

A-006: I would like to use the camera for NIR applications; how should I deal with the pre-installed IR-cut filter?

[A] All of Mightex’s area cameras have IR-cut filters installed during volume production, but the filters can easily be removed by customers. Since the camera lens mount is designed with the thickness of the IR-cut filter taken into account, one should (in order to maintain proper focus) replace the IR-cut filter with either a plain glass plate or an IR-pass filter, depending on the specific requirements. The plain glass plates and IR-pass filters can be ordered separately here: https://www.mightexsystems.com/product/camera-accessories-optical-filters/. Line cameras are not installed with any filters.

A-007: How do I install the camera device driver?

[A] Please refer to the camera Quick Start Guide for details.

A-008: I am looking for a camera for lab use. Could I use a board-level camera?

[A] If you are looking for a camera for lab use, it is highly recommended that you choose an enclosed camera, NOT a board-level one. On a board-level camera, all electronic components are exposed and are therefore susceptible to damage from Electrostatic Discharge (ESD). For example, a board-level camera could be damaged if one simply touches it with a bare hand while not properly grounded. ESD-related damage is not covered by the warranty.

A-009: Why can’t Windows verify the digital signature for the drivers required by a Mightex camera?

[A] This problem only occurs on Windows XP. Since Microsoft has stopped supporting digital signatures for Windows XP (only Windows Vista and 7 are supported now), no company can roll out NEW products that are ‘digitally signed’ on Windows XP, and Mightex is no exception. However, this does not in any way affect the operation of our products on Windows XP: they should work just fine, and there won’t be any damage to customers’ computers. There is only some ‘inconvenience’, as customers are ‘warned’ by Windows that the products are not ‘digitally signed, and damage might be done’ to their systems – which is NOT true when the software comes from a secure source (i.e. Mightex in this case).

The above issue, however, should not exist with Windows Vista and 7.

A-010: Why don’t the sample codes in C++/C#/VB included on the CD compile?

[A] Mightex’s camera demo software was originally written in Delphi, and the sample codes (in C++, C#, VB, etc.) included on the CD are only meant to show how to call the APIs; they are NOT fully functional programs. As one can understand, it is not practical for our product development team to develop the same software in four different programming languages. In other words, the sample codes are for you to read, not to compile/run, as they are incomplete.

A-011: I’m sending a TTL trigger signal from a function generator to multiple cameras to capture images in synchronization. The image data arrives at my program (i.e. through the frame callback function) with variable delay. How should I match each set of images that were triggered by the same TTL pulse and captured at the same point in time?

[A] The variability in the callback timing is normal and is due to Windows not being a real-time OS. The execution of the callback function depends on Windows task management and scheduling, which can be variable even when the callback runs in the highest-priority thread.

To synchronize multiple cameras, the recommended method is to use the TriggerEventCount and TimeStamp frame attributes associated with the callback function; frames carrying the same TriggerEventCount were produced by the same TTL pulse. For further information, please refer to the SDK manual for the specific camera model (generally found in the “\SDK\Documents\” folder of the camera’s software package). As an aside, when triggering multiple cameras from the function generator, use a common shared trigger line if possible to ensure good synchronization.
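
As an illustration, here is a hedged sketch of the matching logic in C++. The attribute names TriggerEventCount and TimeStamp come from the answer above, but the FrameRecord structure, the OnFrame callback signature, and the camera count are hypothetical stand-ins for whatever the SDK manual for your camera model defines. Frames are grouped by trigger count, so a set is complete once every camera has delivered a frame for the same TTL pulse, regardless of arrival order.

    // Hypothetical sketch: group frames from multiple cameras by trigger count.
    #include <cstddef>
    #include <cstdio>
    #include <map>
    #include <mutex>
    #include <vector>

    struct FrameRecord {                 // stand-in for the SDK's frame attributes
        int cameraId;
        unsigned triggerEventCount;      // which TTL pulse produced this frame
        double timeStamp;                // camera-side time stamp (for cross-checking)
        std::vector<unsigned char> pixels;
    };

    constexpr std::size_t kNumCameras = 3;  // assumed number of cameras
    std::mutex g_mutex;
    std::map<unsigned, std::vector<FrameRecord>> g_pending;  // keyed by trigger count

    // Call this from each camera's frame callback, in whatever order frames arrive.
    void OnFrame(FrameRecord frame) {
        std::lock_guard<std::mutex> lock(g_mutex);
        const unsigned key = frame.triggerEventCount;
        auto& set = g_pending[key];
        set.push_back(std::move(frame));
        if (set.size() == kNumCameras) {  // one frame from every camera has arrived
            std::printf("matched set for trigger pulse %u\n", key);
            // ... process/save the synchronized image set here ...
            g_pending.erase(key);
        }
    }

    int main() {
        // Simulated out-of-order arrival: camera 1's frame for pulse 7 comes last.
        OnFrame({0, 7, 0.001, {}});
        OnFrame({2, 7, 0.003, {}});
        OnFrame({1, 7, 0.009, {}});  // completes the set for trigger pulse 7
    }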