Overview
Description
Thermal Model Comparison (TMC) is a software tool used to identify defects on electronic devices such as circuit board assemblies. Many populated circuit board defects, such as shorts and stressed components, cannot easily be identified using conventional methods such as ICT, FT, AOI, and AXI. Much time is spent debugging boards with such defects, and often these boards end up in the scrap pile. TMC provides an alternative method to isolate these defects, filling the gaps between conventional debugging techniques.
Figure 1: Thermal model comparison
Test Overview
A Thermal Model Comparison test involves comparing a device's thermal behavior to the thermal behavior of operational devices (the Model Group). The thermal comparison is made over a specific period of time and at a number of consecutive time intervals determined by settings in the Test Time window. To create the thermal model, one or more operational devices are tested under power in order to develop the Model that defines how non-defective devices should behave thermally. Less complex devices may need to be powered for only a short time. More sophisticated devices, however, may need to be energized using diagnostic or functional tests in order to exercise all electronic components. After the model has been created, test devices are powered in the same manner as those in the Model Group and their thermal behavior is compared to the model. Thermal deviations from the model and their severity are displayed to aid in troubleshooting.
TMC can detect very small temperature differences between functional and defective devices that are nearly impossible to detect using any other temperature measurement method. Entire circuit boards can be inspected at once, regardless of component density, and without contacting the board. Hundreds of thousands of individual infrared detector elements in the thermal camera act as virtual test probes.
TMC employs a software process called Image Subtraction wherein a reference thermal image of an unpowered device is recorded and then subtracted from each thermal image that is captured while the device is powered. Because subtracted images represent temperature changes, ambient temperature fluctuations from test to test do not affect test sensitivity and results.
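A minimal sketch of the image-subtraction idea, written here in Python/NumPy for illustration; the array names and shapes are assumptions, not the Thermalyze implementation:

```python
import numpy as np

def subtract_reference(frames: np.ndarray, reference: np.ndarray) -> np.ndarray:
    """Convert absolute-temperature frames into temperature-change frames.

    frames:    (n_images, height, width) images captured while the device is powered
    reference: (height, width) image of the unpowered device
    """
    # Each subtracted frame shows heating relative to the unpowered state,
    # so a uniform shift in ambient temperature cancels out of the result.
    return frames - reference[np.newaxis, :, :]
```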
Don't Move: Because TMC tests involve pixel-by-pixel comparisons between operational and test devices, the camera and device positions must be fixed and must not be changed.
No Touchup: Disable touchup calibrations when conducting TMC tests to prevent touchup-related image artifacts and because image subtraction already removes the noise that touchup calibrations are intended to correct.
Test Process
At the end of each test, the sequence of thermal images captured during the test is automatically saved with a unique ID. Thermal sequences of operational devices can be added to the Model Group and used to create the Model. Defective devices that have been successfully troubleshot are added to the Fault Group along with failure mode and repair details. After the thermal sequence has been saved, it is compared to the Model and the results of the comparison are displayed. Areas on the device that are outside the acceptable range of the Model are displayed in color, with pixel values indicating the magnitude of variance from the model. The Fault Group can be searched automatically to identify defective devices with similar thermal behavior.
Basic Functional Verification
TMC can be used to verify the basic functionality of electronic devices such as circuit board assemblies when no functional test is available. No custom hardware is required, and tests can usually be set up in less than an hour. TMC is ideal for testing basic functionality when the high cost of a dedicated functional tester cannot be justified. More thorough functional validation is achieved when all electronic components on a device are electrically exercised.
Test Results
Test Results consist of a comparison sequence of images representing differences in thermal behavior between the test device and the Model over the test duration. An image in the comparison sequence will display values greater than zero to indicate components or areas on the test device having higher temperature than the Model. Similarly, values less than zero indicate areas having lower temperature than the Model. These areas, called Fail Sites, are each enclosed in a rectangle and numbered consecutively according to severity.
Certain types of failure modes, such as some low-resistance short circuits, can sometimes be identified immediately or isolated to a small area on the device. Other failure modes, such as open circuits or missing components, may produce secondary thermal effects resulting in a number of suspect areas. In such cases, fault location can sometimes be determined by locating the earliest thermal difference in the comparison sequence. If the earliest thermal difference does not indicate the true fault location, the entire comparison sequence should be examined to identify additional Fail Sites. Knowledge of device functionality and repair history and access to device schematics will reduce troubleshooting time significantly. By considering the severity of Fail Sites and the time in which Fail Sites occur during the test, the defect search area can be significantly reduced.
The following is an example of the troubleshooting process when multiple Fail Sites are detected. Upon initial examination of the comparison sequence, an area containing a number of components is discovered to be lower in temperature than the Model. This could indicate that one or more of the components is not functioning properly. After inspecting the area under X-ray, no internal defects are found. Device schematics indicate that the functionality of the suspect components is controlled by the I/O output from a neighboring component. Although the comparison sequence reveals no thermal differences between this component and the Model, it is inspected and identified as the likely source of the failure. The diagnosis is verified by swapping out the component and retesting the device.
Fault Search
After a device has been tested and successfully troubleshot, failure mode and repair details are added to the device test file, which is added to the Fault Group. When testing devices in the future, the Fault Group can be searched to identify close matches ranked by similarity percentage. As the Fault Group grows, troubleshooting time decreases.
Main Window and Tabs
The topics in this section cover Thermal Model Comparison windows and tabs.
Main Window
Description
The Thermal Model Comparison (TMC) window (see Figure 2) helps troubleshoot electronic devices by assessing the thermal differences between faulty and operational devices. To open this window, click the Thermal Model Comparison item under the Testing menu or press the button in the Testing section of the Shortcuts toolbar.
Figure 2: Thermal Model Comparison window
Project Folder and Data
When a project is saved, a project folder is created with the same name as the project settings file to contain the project database, test sequences, regions, notes, and overlay. The following data is saved:
Project Settings |
Project settings are saved in an ASCII text file with the .otmc extension. All settings accessible in the Thermal Model Comparison, Test Time, Model Settings, Overheat Protection, Relay Setup, Positioning Blocks, and User Preferences windows are saved. Additionally, several lockable settings are saved, including Image Averaging, Camera Lens, and Calibration Range. Touchup Calibration is also saved but is not a lockable setting. |
Project Database |
A Microsoft Access OLE DB 12.0 database located in the project folder contains the following project data: Test ID, test date/time, test notes, Model Group ID, and Fault Group ID. |
Test Sequences |
Each test sequence is saved in a folder named "Test History" within the project folder. This directory is created within the project folder after the project has been locked and a device has been tested. Test image sequences are automatically saved in this folder at the end of each test. The saved sequence file and associated thumbnail are given the same name as the test ID. |
Project Regions |
Regions on the Main Image can be saved in a binary file with the .orgn extension located in the project folder. Project Regions are opened and displayed when the project is opened. |
Project Notes |
Notes entered into the Project Notes window can be saved in a binary file with .onts extension located in the project folder. Project notes are opened and displayed when a project is opened. |
Project Overlay |
An overlay can be saved with .bmp extension in the project folder. The project overlay is loaded as the current overlay when a project is opened. |
File
Open Project |
Select the TMC project file (.otmc) to open. When a project is opened, all tests added to the Model Group in the project database are loaded into memory to speed Model creation. Tests added to the Fault Group, however, are not loaded into memory because loading a large number of tests could slow computer performance. Note: The file name of the most recently opened or created project file is displayed at the bottom of the window. |
New Project |
Create a new TMC project file. When creating a project file, the file name that you provide is appended with ".otmc" and saved in the "Optotherm\Thermalyze\Thermal Model Comparison" folder, unless a different folder is selected. Note: The file name of the most recently opened or created project file is displayed at the bottom of the window. |
Close Project |
Close the current project. Tip: It is a good idea to close project files when you are finished with a project to prevent accidentally making changes to the project file settings. |
Lock Project |
Lock the current project. When a project is locked, the Lockable Settings cannot be changed, and the lock icon on the top toolbar changes to show the locked state. While a project is open, changes to Lockable Settings can only be made while the project is unlocked. Once a project file is locked, it cannot be unlocked. This feature prevents inadvertent changes to settings that would invalidate prior test data. Project files should only be locked after test setup is complete and no further adjustments need to be made to lockable settings. Note: Test sequences are automatically saved at the completion of each test only if the project is locked. |
Save Project As |
Save the current project under a new name. Note: The project files under the former name are not deleted, to prevent accidentally losing important data. You must manually delete these files in order to remove them from the hard disk. |
Save Project Settings |
Save all program settings to the project settings file. |
Save Project Regions |
Save Regions on the Main Image to the current project folder. Regions can be used to define specific areas on a device and to exclude components, such as power leads or connectors, that may cause test failures due to high and inconsistent heating. |
Save Project Notes |
Save notes entered in the Project Notes window to the current project folder. Notes are used to enter descriptive information about the devices under test. |
Save Project Overlay |
Save the current overlay to the current project folder. Overlays are used to quickly locate fail sites on a device. |
Setup
Test Time |
Select this menu item to open the Test Time window and set the test time and analysis rate. |
Instrument Control |
Select this menu item to open the Instrument Control window. The instrument output is used to power devices during TMC tests using Keithley source measure units. |
Relay Setup |
Select this menu item to open the Relay Setup for Thermal Model Comparison window. The relay outputs are used to control power to electronic devices during TMC tests. |
Positioning Blocks |
Select this menu item to open the Positioning Blocks window and record the locations of fixture blocks used to position devices. |
Short Circuit Test |
Select this menu item to open the Short Circuit Test window and set the short circuit detection temperature increase and time. |
Model Settings |
Select this menu item to open the Model Settings window and set the criteria for creating the Model. |
Project Notes |
Select this menu item to open the Project Notes window and enter information associated with the project. |
Tabs
Controls and parameters for testing devices are separated into the following five tabs:
- New Test tab: Enter a new test device ID and start a new test.
- Test History tab: View test image sequences, add tests to the Model Group or Fault Group, and add troubleshooting notes.
- Test Results tab: Locate thermal differences between a device and the Model and search the Fault Group for similar thermal behavior.
- Model Group tab: View Model Group test sequences, create the Model, test the Model Group against the Model, and remove tests from the Model Group.
- Fault Group tab: View Fault Group test sequences and remove tests from the Fault Group.
Project Status
Model |
Displays when the Model needs to be updated due to changes in the Model Group or Model Settings. |
Project File |
Displays the file name of the most recently opened or created project file. |
New Test Tab
Description
The Thermal Model Comparison window New Test tab (see Figure 3) contains controls and settings for starting a new Thermal Model Comparison (TMC) test. Each test is performed over a specific time period and at a number of consecutive time intervals determined by settings in the Test Time window. Thermal images captured during the test are held in memory as an image sequence and saved to the hard disk at the end of the test. Test sequences can be reloaded and retested using the Test History tab.
Figure 3: New Test Tab
New Test Information
ID |
Enter a unique name for the new test device such as its serial number. The ID is used as the file name of the test image sequence when it is automatically saved at the end of the test. |
Notes |
Enter descriptive information about the manufacturing and operational history and state of the device. Known operational devices that will be added to the Model Group should be designated as "Operational" or with a similar descriptor. Tip: Notes can be added or changed later on the Test History tab. |
Test Options
Check ID Before Test |
Check this box to require pressing the Check ID button before starting each new test. If this box is unchecked, the ID is automatically checked immediately after pressing the ON button. |
Restore Color Palette Defaults Before Test |
When a test ends, it is common practice to alter the Color Palette Max and Min values while reviewing test result images. Check this box to set the Palette Max and Min values back to the default values at the beginning of each new test so that live thermal events during the test can be compared under the same color palette settings from test to test. |
Erase ID and Notes After Test |
Check this box to erase the ID and Notes fields after each test ends so that duplicate information is not mistakenly entered for the following test. |
Run Test
Check ID |
Click this button to check if an ID has already been used. |
ON/OFF |
Press this button to start a new test. |
Image Stability: |
This field displays the temperature stability of the device being imaged. Before starting each test, make sure this value is close to the Noise Equivalent Temperature Difference (NETD) of the camera/lens combination to ensure that the temperature of the device has stabilized after being handled or previously tested. See the camera and lens specifications for NETD information. For example, the NETD of the IS640 camera with Macro lens is < 40mK. Accordingly, image stability should be close to 0.04°C prior to starting a new test. Note: If one or more regions exist, image stability is calculated only within the areas enclosed by regions. |
Time Remaining: |
This field displays the time remaining in the test. |
Test History Tab
Description
The Test History tab (see Figure 4) contains controls and settings for performing the following tasks:
- Search for and load tests
- View test image sequences
- Delete tests
- Change test ID or notes
- Add tests to the Model Group
- Add tests to the Fault Group
When the Test History tab is selected and image capture is not activated, the loaded test sequence images are displayed in the Main Image. Use the controls in the Image Sequence panel to view individual images in the sequence.
Image Subtraction: If the ON/OFF button is pressed in the Image Subtraction panel, test images are displayed with the thermal image of the unpowered device subtracted. Subtracted images represent temperature changes from the moment power was applied to the device. If the ON/OFF button is not pressed, test images are displayed as absolute temperature images.
Figure 4: Test History Tab
Loaded Test
ID |
This text box displays the unique ID of the loaded test. The ID of the loaded test can be changed by altering the ID in this text box and then clicking the Save Changes button. |
Tested |
This field displays the date and time of the test. |
Notes |
This text box displays the notes associated with the loaded test. The loaded test notes can be changed by altering the notes in this text box and then clicking the Save Changes button. Tip: Failure mode information is often added to notes after a defective device has been successfully troubleshot. Descriptive defect information is helpful when performing a Fault Group search. |
Save Changes |
Click this button to save any changes that were made to the loaded test ID or notes. |
Loaded Test Actions
Compare to Model |
Click this button to compare the loaded test to the Model and then display the comparison sequence. |
Add to Model Group |
Click this button to add the loaded test to the Model Group. Important: The loaded test (not the test selected in the Test History table) is added to the Model Group. |
Add to Fault Group |
Click this button to add the loaded test to the Fault Group. Important: The loaded test (not the test selected in the Test History table) is added to the Fault Group. |
Test History Table
To aid in searching for specific tests, the Test History table can be reordered by clicking the heading for each column. Click the same column heading to alternate between ascending and descending order for the data in that column.
Load Selected Test |
Click this button to load the test selected in the table into memory. |
Resize Table |
Click this button to automatically resize the Test History table to fit the data in the ID and Tested columns. |
Delete Selected Test |
Click this button to delete the test selected in the table from the project. Caution: Deleted tests are removed from the hard drive. |
Search Test History
Show All Tests |
Click this button to display all tests in the project. |
Total Number of Tests |
Displays the total number of tests in the project. |
Search Dates |
Click this button to display only those tests conducted between the dates selected in the From and To fields. |
Search ID |
Click this button to display only those tests whose ID contains the text entered in the Search ID field. |
Search Notes |
Click this button to display only those tests whose notes contain the text entered in the Search Notes field. |
Test Results Tab
Description
The Test Results tab (see Figure 5) contains controls and settings for performing the following tasks:
- View the thermal comparison between a device and the Model
- Locate thermal differences between a device and the Model
- Search the Fault Group for tests with similar thermal behavior
When the Test Results tab is selected and image capture is not activated, the comparison image sequence is displayed in the Main Image. Use the controls in the Image Sequence panel to view individual images in the sequence.
Color Palette: A separate set of Color Palette Max and Min values (and defaults) is maintained for the Test Results tab. The Min and Max values and their defaults therefore change when the Test Results tab is selected. Use this feature to save a set of palette values optimized for viewing comparison sequence images.
Figure 5: Test Results Tab
Comparison Sequence
When the thermal behavior of a device is compared with the Model, a comparison sequence is created by comparing each image in the test sequence to a corresponding set of pixel limits in the Model. When a pixel value in the test sequence image falls within the Model high and low limits, the corresponding pixel in the comparison sequence is set equal to zero. When a pixel value in the test sequence image falls outside the Model limits, the corresponding pixel in the comparison sequence is set to the difference between the pixel value and the closest limit.
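This pixel-wise rule can be sketched as follows, assuming the test sequence and the Model limits are NumPy arrays of identical shape (illustrative names, not the actual implementation):

```python
import numpy as np

def compare_to_model(test_seq, model_low, model_high):
    """Build a comparison sequence from a test sequence and Model limits.

    All arrays have shape (n_images, height, width). Pixels within
    [model_low, model_high] become 0; pixels outside the limits become
    the signed difference from the closest limit.
    """
    comparison = np.zeros_like(test_seq)
    above = test_seq > model_high
    below = test_seq < model_low
    comparison[above] = (test_seq - model_high)[above]  # positive: hotter than the Model
    comparison[below] = (test_seq - model_low)[below]   # negative: colder than the Model
    return comparison
```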
Fail Sites
A fail site occurs when a group of pixels in a test sequence image (with size equal to or larger than the Fail Site Radius) falls outside the Model limits. Fail sites are enclosed in a rectangle and are numbered consecutively according to severity. The most severe fail site is labeled with the number 1. A value representing the severity of each fail site is displayed in parentheses to the right of the number. A higher severity magnitude indicates a higher probability that the fail site is associated with abnormal thermal behavior. Displayed to the right of the severity value is a number representing the magnitude by which pixels within the fail site exceeded the Model limits. Negative severity and temperature difference values indicate that the fail site was colder than the Model.
Fail Sites: A fail site can occur on any image (not just the last image) within the comparison sequence.
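Fail-site extraction can be pictured as connected-component labeling of the nonzero pixels in a comparison image. In the sketch below, the size test and the severity measure are plausible stand-ins for the tool's actual, undocumented criteria:

```python
import numpy as np
from scipy import ndimage

def find_fail_sites(comp_image, fail_site_radius):
    """Group out-of-limit pixels into candidate fail sites.

    comp_image: (height, width) comparison image (0 = within Model limits)
    Returns a list of (bounding_box, severity) sorted by severity magnitude.
    """
    labels, _ = ndimage.label(comp_image != 0)
    sites = []
    for box in ndimage.find_objects(labels):
        height = box[0].stop - box[0].start
        width = box[1].stop - box[1].start
        # Keep only groups at least as large as the Fail Site Radius
        # (a radius of 0 lets single-pixel fail sites through).
        if min(height, width) < 2 * fail_site_radius:
            continue
        patch = comp_image[box]
        # Illustrative severity: the signed extreme deviation in the group.
        severity = patch.flat[np.abs(patch).argmax()]
        sites.append((box, severity))
    sites.sort(key=lambda s: abs(s[1]), reverse=True)  # site 1 = most severe
    return sites
```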
Loaded Test
ID |
This field displays the unique ID of the test that was most recently compared with the Model. Note: If this ID does not match the ID of the test currently loaded from the Test History tab, a message is displayed below this field. This can occur if a different test is loaded after a comparison has been performed. |
Notes |
This field displays the notes associated with the test that was compared to the Model. |
Test Results
Regions: Regions define image areas that are evaluated when making comparisons with the model. If no Regions exist, the entire image is evaluated.
PASS/FAIL |
This field displays the results of the most recent comparison of a test sequence with the model. PASS indicates that there were no fail sites detected during the comparison. FAIL indicates that there was at least one fail site detected during the comparison. Note: A comparison will pass only if there are no fail sites detected on any image in the comparison sequence. |
Results Status |
When tests have been added to or removed from the Model Group, or when Model Settings have been changed, a message is displayed below the PASS/FAIL field indicating that the test should be re-compared to the Model. |
Image Fail Sites |
This field displays the total number of fail sites in the comparison sequence image currently displayed. |
< > |
Click these buttons to display the previous and next fail sites on the currently displayed comparison sequence image in order of severity. |
Show All |
Click this button to display all fail sites on the currently displayed comparison sequence image. |
Number Shown |
Select the number of fail sites to display on the current comparison image. Note: If this number is smaller than the total number of fail sites on the current comparison image, some of the fail sites will not be displayed. |
Invert Color |
The default color of fail site rectangles is white. Check this box to change the color of fail site rectangles to black. |
Fault Group Search
Row Order: To aid in searching for specific tests in the Fault Group Search table, table rows can be reordered by clicking a column heading. Click the same column heading to alternate between ascending and descending order for the data in that column.
Search Fault Group |
Press this button to search the Fault Group for tests whose thermal behavior, when tested against the Model, is similar to that of the loaded test. Each test that has been added to the Fault Group is opened in turn and compared with the Model. Fault Group tests are listed in the table in order from most to least similar. A similarity value indicates the percentage of matching fail sites (a sketch of one possible matching rule follows this table). Tip: When searching for similar tests in the Fault Group, notes describing the root cause of a defect can be very helpful in troubleshooting. Make sure to add descriptive notes for each test added to the Fault Group. |
Resize Table |
Click this button to automatically resize the Fault Group Search table to fit the data in the ID, Similarity, and Tested columns. |
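The exact rule for matching fail sites is not documented here; a rough sketch, under the assumption that two fail sites match when their bounding rectangles overlap, might look like this:

```python
def similarity_percent(test_sites, fault_sites):
    """Rough similarity between two fail-site lists (tuples of slices).

    Assumes two sites match when their rectangles overlap; the real
    matching rule used by TMC may differ.
    """
    def overlaps(a, b):
        (ay, ax), (by, bx) = a, b
        return (ay.start < by.stop and by.start < ay.stop and
                ax.start < bx.stop and bx.start < ax.stop)

    if not test_sites and not fault_sites:
        return 100.0
    matched = sum(any(overlaps(t, f) for f in fault_sites) for t in test_sites)
    return 100.0 * matched / max(len(test_sites), len(fault_sites))
```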
Model Group Tab
Description
The Model Group tab (see Figure 6) contains controls and settings for performing the following tasks:
- Search for and view Model Group test image sequences
- Create the Model
- Test the Model Group against the Model
- Remove tests from the Model Group
When the Model Group tab is selected and image capture is not activated, sequence images of the Model Group test selected in the Model Group table are displayed in the Main Image. Use the controls in the Image Sequence panel to view individual images in the sequence.
Figure 6: Model Group Tab
Model
The Model defines the thermal acceptance limits for each pixel in every image of the test sequence. The Model is created by applying Model Settings to the Model Group.
Model Group Table
Tip: To aid in searching for specific tests in the Model Group table, rows can be reordered by clicking a column heading. Click the same column heading to alternate between ascending and descending order for the data in that column.
Create Model |
Click this button to create the Model. |
Compare to Model |
Click this button to compare the selected Model Group test to the Model. |
Compare All |
Click this button to compare all Model Group tests to the Model. Tip: This is useful for identifying Model Group outliers that should be removed from the group. |
Resize Table |
Click this button to automatically resize the Model Group table to fit the data in the ID and Tested columns. |
Remove Selected Test |
Click this button to remove the test selected in the Model Group table from the Model Group. Note: The selected test is not deleted, it is simply unmarked as a member of the Model Group in the project database. |
Search Model Group
Total Number of Model Group Tests |
Displays the total number of tests that have been added to the Model Group. |
Show All Tests |
Click this button to display all tests that have been added to the Model Group. |
Search ID |
Click this button to display only those tests in the Model Group whose ID contains the text entered in the Search ID field. |
Search Notes |
Click this button to display only those tests in the Model Group whose notes contain the text entered in the Search Notes field. |
Fault Group Tab
Description
The Fault Group tab (see Figure 7) contains controls and settings for performing the following tasks:
- Search for and view Fault Group test image sequences
- Remove tests from the Fault Group
When the Fault Group tab is selected and image capture is not activated, test sequence images of the Fault Group test selected in the Fault Group table are displayed in the Main Image. Use the controls in the Image Sequence panel to view individual images in the sequence.
Figure 7: Fault Group Tab
Fault Group Table
Tip: To aid in searching for specific tests in the Fault Group table, rows can be reordered by clicking a column heading. Click the same column heading to alternate between ascending and descending order for the data in that column.
Resize Table |
Click this button to automatically resize the Fault Group table to fit the data in the ID and Tested columns. |
Remove Selected Test |
Click this button to remove the test selected in the Fault Group table from the Fault Group. Note: The selected test is not deleted, it is simply unmarked as a member of the Fault Group in the project database. |
Search Fault Group
Total Number of Fault Group Tests |
Displays the total number of tests that have been added to the Fault Group. |
Show All Tests |
Click this button to display all tests that have been added to the Fault Group. |
Search ID |
Click this button to display only those tests in the Fault Group whose ID contains the text entered in the Search ID field. |
Search Notes |
Click this button to display only those tests in the Fault Group whose notes contain the text entered in the Search Notes field. |
Project Settings
The topics in this section explain how to set Thermal Model Comparison project settings.
Lockable Settings
Description
Program settings include all of the settings that can be changed by the user in the Thermalyze program. Lockable Settings are specific program settings that cannot be altered after a Thermal Model Comparison project has been locked, because changing their values would invalidate any test image sequences that have been saved to the project. Changes to Lockable Settings can only be made while the project is unlocked.
Settings accessible in the Test Time window
- Test Time (Hours, Minutes, and Seconds)
- Analysis Rate
Settings located in the Relay Setup window
- Enable Relays
- Enable Relay (individual relays 0 through 7)
- Activate (individual relays 0 through 7)
- Deactivate (individual relays 0 through 7)
Settings accessible in the Camera Control toolbar
- Enable Image Averaging
- Number of Images to Average
- Camera Lens
- Calibration Range
Settings located in the User Preferences window
- Flip Captured Images Horizontally
- Flip Captured Images Vertically
Settings located in the Image Processing window
- Noise Reduction
- Enable Image Smoothing
- Image Smoothing Matrix Size
- Image Smoothing Strength
Test Time
Description
The length of a Thermal Model Comparison (TMC) test and the frequency at which test thermal images are recorded and analyzed can be set in the Test Time window (see Figure 8). To open this window, click the Test Time item under the Setup menu (or press the button on the top toolbar) of the Thermal Model Comparison window.
Figure 8: Test Time window
Model Settings
Description
Settings that control how the Model is created and how tests are compared to the Model are accessible on the Model Settings window (see Figure 9). To open this window, click the Model Settings item under the Setup menu (or press the button on the top toolbar) of the Thermal Model Comparison window.
Model Creation settings determine how the Model is generated. Model Comparison settings determine how tests are compared to the Model. The Model consists of a collection of pixel acceptance ranges for each image in the test sequence. In a test comprising 10 test images, for example, there is an acceptance range for every image pixel in each of the 10 images. When a test image sequence is compared against the Model, comparisons are made for each test image and for each image pixel. This enables the detection of thermal differences anywhere on a device at any time during the test.
Figure 9: Model Settings window
Model Creation
Small Group |
Select this option when creating the Model from a Model Group containing fewer than 10 tests. The pixel acceptance ranges in the Model will include the range of thermal behavior of every device in the group. As a result, all tests in the Model Group will pass if they are compared against the Model. |
Large Group |
Select this option when creating the Model from a Model Group containing more than 10 tests. The pixel acceptance ranges in the Model are calculated by applying a number of standard deviations to the average of the Model Group tests. |
Spatial Tolerance |
Sometimes device features and components may not be aligned precisely from device to device. This may be due to an inconsistent manufacturing process or difficulty fixturing an unusual device configuration. In these cases, the Spatial Tolerance setting can compensate for slight misalignment from test to test. Severe misalignment, however, should be rectified by improved device fixturing. Select the estimated misalignment in number of pixels. When the Model is created, each pixel acceptance range will include the thermal behavior of adjacent pixels within a radius of the Spatial Tolerance setting (see the sketch following this table). Values range from 0 to 10 pixels. Important: Because Spatial Tolerance will reduce test sensitivity, non-zero values for this setting should only be used when properly aligning devices is not possible. |
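A sketch of the two model-creation modes and the Spatial Tolerance widening, assuming the Model Group sequences are stacked in a NumPy array; the names, default values, and the min/max-filter approach to Spatial Tolerance are illustrative assumptions:

```python
import numpy as np
from scipy import ndimage

def create_model(group, large_group=False, n_std=3, spatial_tolerance=0):
    """Build per-pixel, per-image acceptance limits from Model Group tests.

    group: (n_tests, n_images, height, width) stack of test sequences
    Returns (low, high) limit arrays of shape (n_images, height, width).
    """
    if large_group:
        # Large group: a number of standard deviations around the average.
        mean, std = group.mean(axis=0), group.std(axis=0)
        low, high = mean - n_std * std, mean + n_std * std
    else:
        # Small group: envelope of every device's behavior, so every
        # Model Group test passes when compared against the Model.
        low, high = group.min(axis=0), group.max(axis=0)

    if spatial_tolerance > 0:
        # Widen each range to include adjacent pixels within the radius
        # (filter size of 1 on the image axis keeps frames independent).
        size = (1, 2 * spatial_tolerance + 1, 2 * spatial_tolerance + 1)
        low = ndimage.minimum_filter(low, size=size)
        high = ndimage.maximum_filter(high, size=size)
    return low, high
```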
Model Comparison
Small Group Sensitivity |
Because a small group model is based on the thermal behavior of only a small number of operational devices, the pixel acceptance ranges in the Model must be expanded to accommodate natural variances inherent in larger numbers of devices. The Small Group Sensitivity setting is used to simulate a larger Model Group by expanding each pixel acceptance range in the Model. Decrease this setting to expand the pixel acceptance ranges in the Model in order to ignore small differences from the Model. Increase this setting to reduce the pixel acceptance ranges in order to detect small differences from the Model. Values range from 1 to 10. When a value of 1 (lowest sensitivity) is entered, the maximum expansion is applied to the pixel acceptance ranges in the Model. When a value of 10 (highest sensitivity) is entered, no expansion is applied. |
Large Group Standard Deviation |
Select the number of standard deviations to apply to the average of the tests in the Model Group for each pixel acceptance range in the Model. Increase this setting to expand the pixel acceptance ranges in the Model to ignore small differences from the Model. Decrease this setting to reduce the pixel acceptance ranges in the Model to detect small differences from the Model. Values range from 1 to 6 standard deviations. |
Image Noise Level |
This setting is used to compensate for image noise by expanding the pixel acceptance ranges in the Model. Image noise can be due to thermal reflections from nearby heat sources such as machinery, lighting, heating ducts, and human movement. Image noise can also be caused by air currents and device handling. Increase this setting to expand the pixel acceptance ranges. Values range from 0.0 to 10.0°C. When a value of 0.0°C is entered, no pixel acceptance range expansion is applied to the Model. Tip: Because it is virtually impossible to eliminate all sources of image noise, it is usually helpful to set the Image Noise Level to an initial value such as 0.5°C and then adjust this value when required. |
Fail Site Radius |
When comparing test sequence images against the Model, there may be groups of pixels that fall outside the pixel acceptance ranges. This setting designates the minimum radius of such a group to qualify as a fail site. Groups with radius smaller than this setting will not be identified as fail sites on comparison sequence images. Increase this setting to reduce the number of fail sites. Reduce this setting to increase the number of fail sites. Values range from 0 to 20 pixels. Note: A value of 0 allows single pixel fail sites to be identified. |
Password Protection
Enable |
Check this box to require the user to enter the correct password before changing any Model Creation or Model Comparison settings. Important: Password protecting model settings prevents unauthorized users from altering settings that could change the model or change how devices are compared against the Model. |
Change Password |
Click this button to open the Change Password window. The password can be zero to eight alphanumeric characters (letters and numbers only). Note: The default password is "". |
Overheat Protection
Description
During a Thermal Model Comparison (TMC) test, overheating may occur due to a short circuit or component degradation. Overheat protection can prevent sensitive components from being damaged. During a TMC test, if the temperature threshold is reached anywhere on the device, the test is immediately stopped and all relays are deactivated.
Settings that control overheat protection are accessible on the Overheat Protection window (see Figure 10). To open this window, click the Overheat Protection item under the Setup menu (or press the button on the top toolbar) of the Thermal Model Comparison window.
Figure 10: Overheat Protection window
Enable |
Check this box to enable overheat protection during TMC tests. When enabled, all relays in the relay device are immediately deactivated when an area on the device reaches the Threshold setting before the Test Time setting has elapsed. Note: If any Regions exist, only areas enclosed by Regions are evaluated for overheating. If no Regions exist, then the entire image is evaluated. |
Threshold |
Enter the short circuit detection temperature. Values range from 1 to 100°C. |
Test Time |
Enter the length of time at the beginning of Model Comparison tests that images will be evaluated for short circuits. Important: After Test Time has expired, short circuits will not be detected even if the Threshold setting has been exceeded. |
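A sketch of the per-frame overheat check described above, assuming live frames as NumPy arrays and illustrative parameter names; when the check returns True, the caller would stop the test and deactivate all relays:

```python
import numpy as np

def overheat_check(frame, elapsed_s, threshold_c, watch_time_s, region_mask=None):
    """Return True if the test should be stopped for overheating.

    frame:       (height, width) live thermal image
    region_mask: optional boolean mask; only masked areas are evaluated
                 (if no Regions exist, the entire image is evaluated)
    """
    if elapsed_s > watch_time_s:  # the detection window has expired
        return False
    area = frame[region_mask] if region_mask is not None else frame
    return bool(area.max() >= threshold_c)
```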
Positioning Blocks
Description
The positions of blocks can be recorded and saved using the Positioning Blocks Fixture window (see Figure 11). To open this window, click the Positioning Blocks item under the Setup menu (or press the button on the top toolbar) of the Thermal Model Comparison window. The window is also accessible by clicking the Positioning Blocks item on the Test Setup menu (or by pressing the button in the Test Setup section of the Shortcuts toolbar) on the main Thermalyze window.
Figure 11: Positioning Blocks window
Positioning Blocks
All positioning blocks are identical, but they can be mounted in two different ways: as fixed blocks (two thumbscrews) or adjustable blocks (one thumbscrew). As shown in Figure 11, fixed blocks are typically mounted on the left and top edges of the board to designate the position of the board. Adjustable blocks are usually mounted on the right and bottom edges of the board to push the board tightly against the fixed blocks for precise board registration. The positions of fixed blocks and adjustable blocks can be interchanged, however, to accommodate different board and component configurations.
Selecting and Moving Blocks
Click a block to select it. Selected blocks have a green border. Click and drag blocks to move them.
File
Open Positioning Blocks |
Click this button to open the Open Positioning Blocks dialog. Select the ASCII text file (.opos) to open. Note: The file name of the most recent Positioning Blocks file opened or saved is displayed at the bottom of the window. |
Save Positioning Blocks |
Click this button to open the Save Positioning Blocks dialog. When saving a Positioning Blocks file, the file name that you provide is appended with ".opos" and saved in the "Optotherm\Thermalyze\Positioning Blocks" folder unless you specify a different folder. Tip: Positioning Blocks files can be saved and then later opened and incorporated into new projects when testing boards with similar layouts. |
Export |
The Positioning Blocks window can be saved to file in the following formats: bmp, jpg, png, and tif. Note: Exported Positioning Blocks windows are saved in the "Optotherm\Thermalyze\Export" folder unless you specify a different folder. |
Print |
Print the Positioning Blocks window. Note: You must have a printer connected to your computer. |
Print with Preview |
Select this menu item to open the Print Preview dialog before printing. |
Board Position Controls
Click this button to add a fixed block using two thumbscrews. |
|
Click this button to add an adjustable block using one thumbscrew. |
|
Click this button to rotate the currently selected block clockwise. |
|
Click this button to remove the currently selected block. |
|
Click this button to remove all blocks. |
Instrument Control
Description
To open the Instrument Control window (see Figure 12), click the button on the Sequence toolbar, or select the Instrument Control item from the Setup menu (or click the button on the top toolbar) of the Thermal Model Comparison window or Lock-in Thermography window.
Figure 12: Instrument Control window
Setup Menu
Instrument Settings |
Open the Instrument Settings window. |
Instrument Startup |
Open the Instrument Startup window. |
Instrument Safety |
Open the Instrument Safety window. |
Instrument Selection
Detect Instruments |
Click this button to detect the instruments connected to the computer and list them in the Select Connected Instrument combo box. Keithley instruments that support USBTMC (USB Test and Measurement Class) can be detected via the USB or GPIB instrument interface. |
Select Connected Instrument |
Select a connected instrument to use. Instruments with a USB interface are listed, for example, as "USB0::0x05E6::0x2612::4482980::INSTR", where "USB0" designates the USB interface, "0x05E6" designates the Keithley Instruments vendor ID, "0x2612" designates the model number, "4482980" designates the instrument serial number, and "INSTR" designates the USBTMC protocol. Instruments with a GPIB interface are listed, for example, as "GPIB0::18::INSTR", where "GPIB0" designates the GPIB interface, "18" designates the GPIB address selected on the instrument, and "INSTR" designates the USBTMC protocol. (A sketch of how such resource strings can be used outside Thermalyze follows this table.) |
Open Connection |
Click this button to establish a connection with the selected instrument. Note: A connection to the instrument must be opened before the instrument can be controlled. |
Enable Instrument |
Check this box to enable the instrument for use during Sequence, TMC, and LIT tests. Note: When this box is checked, the instrument output is used in place of Sequence Relays, TMC Relays, and LIT Relays. Important: The Instrument Control window must be open to issue commands to the instrument during Sequence, TMC, and LIT tests. |
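Outside of Thermalyze, resource strings in this format can be exercised with the PyVISA library. A minimal sketch (the resource string is the USB example from the table above; PyVISA is not part of Thermalyze):

```python
import pyvisa

rm = pyvisa.ResourceManager()
print(rm.list_resources())  # e.g. ('USB0::0x05E6::0x2612::4482980::INSTR',)

# A connection must be opened before the instrument can be controlled
# (cf. the Open Connection button above).
inst = rm.open_resource("USB0::0x05E6::0x2612::4482980::INSTR")
print(inst.query("*IDN?"))  # ask the instrument to identify itself
inst.close()
```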
Instrument Information
Select Model |
Select the model of the instrument selected in the Select Connected Instrument combo box. |
Command Protocol |
Select the instrument communication protocol: SCPI (Standard Commands for Programmable Instruments) or TSP (Test Script Processor). SCPI is an older generic command set supported by older instruments. TSP is a Keithley proprietary command set supported by newer Keithley instruments. Not all Keithley instruments support both protocols. Important: The protocol must also be selected on the instrument. On a 2450 graphical series source measure unit, the communication protocol can be found under System >> Settings >> Command Set. |
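As an illustration of the two command styles, a simple voltage-source setup might look like this in each protocol. The commands are paraphrased from Keithley documentation; verify the exact syntax for your instrument model:

```python
# SCPI: generic colon-delimited commands (older, widely supported).
scpi_setup = [
    ":SOUR:FUNC VOLT",  # source voltage
    ":SOUR:VOLT 5",     # 5 V output level
    ":OUTP ON",         # enable the output
]

# TSP: Keithley's proprietary Lua-based command set (newer instruments).
tsp_setup = [
    "smu.source.func = smu.FUNC_DC_VOLTAGE",
    "smu.source.level = 5",
    "smu.source.output = smu.ON",
]
```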
Source Settings
Sourcing |
Choose to source either voltage or current. |
Channel |
Choose the instrument channel to use. |
Setup Instrument |
Click this button to set up the instrument with the selected source and measure settings. When instrument settings are changed, the message "* Setup Required" is displayed to the right of the button until the instrument has been set up with the new settings. Note: The instrument must be set up before a test can be started. |
Source Voltage
Voltage High |
Enter the output voltage during a Sequence or TMC test, and the first half of each cycle of a lock-in test. Note: The voltage measurement range is determined by the larger absolute value of the Voltage High and Low settings. |
Voltage Low |
Enter the output voltage during the second half of each cycle of a lock-in test. |
Current Limit |
Enter the maximum output current. Note: This value may be automatically limited to the Safety Current Limit if the source voltage exceeds the Safety Voltage Limit. Important: When Current Limit is exceeded during a test, instrument output response may slow significantly as output voltage is reduced to lower current below the limit level. This may result in voltage and current readings that do not reflect values when the output settles. |
Current Range |
Enter the current measurement range. When testing circuits with high capacitance or inductance, setting the range higher than Current Limit can help prevent damage to instrument output circuitry. To set the current measurement range equal to Current Limit, set this value lower than Current Limit. Note: Some instrument models require Current Limit to be within a specific percentage of Current Range and will issue an error if this value is set too high. |
Test |
Check a box to turn on and test the instrument high or low voltage output. Uncheck a box to turn off the output. These boxes are used during test setup to confirm electrical connections and to evaluate the power dissipation in and resistance of a defect in order to estimate the required testing time. |
Source Current
Current |
Enter the output current during a Sequence or TMC test, and the first half of each cycle of a lock-in test. Note: The current measurement range is determined by this setting. |
Voltage Limit |
Enter the maximum output voltage. Note: This value may be automatically limited to the Safety Voltage Limit if the source current exceeds the Safety Current Limit. Important: When Voltage Limit is exceeded during a test, instrument output response may slow significantly as output current is reduced to lower voltage below the limit level. This may result in voltage and current readings that do not reflect values when the output settles. |
Voltage Range |
Enter the voltage measurement range. When testing circuits with high capacitance or inductance, setting the range higher than Voltage Limit can help prevent damage to instrument output circuitry. To set the voltage measurement range equal to Voltage Limit, set this value lower than Voltage Limit. Note: Some instrument models require Voltage Limit to be within a specific percentage of Voltage Range and will issue an error if this value is set too high. |
Test |
Check the box to turn on and test the instrument current output. Uncheck the box to turn off the output. This box is used during test setup to confirm electrical connections and to evaluate the power dissipation in and resistance of a defect in order to estimate the required testing time. |
Measurement Readings
Timed: Enable |
Check the Enable box to conduct real-time instrument measurements, which are displayed on the instrument front panel and at the bottom of the Instrument Control window. Timed readings are performed when a voltage or current Test box is checked and during Sequence and TMC tests. Important: When controlling an instrument remotely via USB or IEEE-488, continuous measurement triggering enables the instrument to conduct measurements continuously when the Test box is checked (see above). If this feature is not available on your instrument, then the Timed: Enable box must be checked to conduct measurements while testing the instrument output. See the Instrument Model Feature Support table to determine if your instrument supports continuous measurement triggering. |
Reading Rate |
Enter the real-time instrument read frequency. |
Lock-in Test: Enable |
Check the Enable box to conduct real-time instrument measurements, which are displayed on the instrument front panel and at the bottom of the Instrument Control window. LIT Test readings are performed during LIT tests after changes in output. Note: Measurements can be read from the instrument during LIT tests only when Cycle Frequency is 1 Hz or lower, to prevent disrupting cycle test timing. |
Read Delay |
This setting determines when current and voltage are read from the instrument after a change in output during a lock-in test. Increase this setting to delay current and voltage measurements, allowing adequate time for the instrument output to settle and for measurements to be conducted. Note: Changes in output voltage and current and their measurement are very fast but not instantaneous. Additionally, the magnitude of the voltage and current output and settings including High and Low, High Capacitance Mode, Current Limit, NPL Cycles, and Auto Zero determine instrument output settling time and measurement speed. |
Take Reading |
After checking a voltage or current test box to turn on instrument output, click this button to read and display the output current and voltage until the instrument output has settled. |
Send Values |
Click this button to send the values in the Current, Voltage, Power, and Resistance fields to the status bar in the Lock-in Thermography window and to the Lock-in Advanced Save and Export window. |
IV Curve |
Click this button to open the IV Curve window. |
Current [mA] |
Display field for the measured current in units of milliamps. |
Voltage [V] |
Display field for the measured voltage in units of volts. |
Power [µW] |
Display field for the power dissipation in the resistive short circuit or leakage current site in units of microwatts. Power is calculated by multiplying the measured current and voltage. Tip: Power dissipation in a resistive short is used to estimate the LIT test time required to detect the defect. |
Resistance [Ω] |
Display field for the resistance of the short circuit or leakage current site in units of ohms. Resistance is calculated by dividing the measured voltage by the measured current. |
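The Power and Resistance fields are simple derived quantities (P = I × V, R = V / I). For example:

```python
def derived_readings(current_a: float, voltage_v: float):
    """Compute the displayed Power [µW] and Resistance [Ω] values."""
    power_uw = current_a * voltage_v * 1e6  # P = I * V, converted to microwatts
    resistance_ohm = voltage_v / current_a  # R = V / I
    return power_uw, resistance_ohm

# Example: 2 mA through a defect at 0.5 V -> 1000 µW and 250 Ω.
print(derived_readings(0.002, 0.5))
```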
Relay Setup
Description
Relays are used to control voltage to a device during a Thermal Model Comparison (TMC) test. To open the Relay Setup for Thermal Model Comparison Tests window (see Figure 13), select the Relay Setup item from the Setup menu (or press the button on the top toolbar) of the Thermal Model Comparison window.
Relay Info: Refer to the relay device manufacturer documentation for detailed specifications and instructions regarding making proper electrical connections.
Figure 13: Relay Setup for Thermal Model Comparison Tests window
Relay Setup
Relay Device Type |
Choose the relay device that is connected to your computer. |
Initialize Selected Device |
Click this button to detect and establish communication with the selected relay device. |
Enable Relays |
Check this box to enable all relays in the selected device. Important: This box must be checked in order for any relays to be activated. |
Relay (0 to 7) |
There are 8 relays (numbered 0 through 7) that can be used to control voltage to a device under test. Each relay can be programmed to activate and deactivate at a specific time during a TMC test. To enable a relay, check its corresponding box. |
Activate (0 to 7) |
Enter the time (in seconds) to activate the relay. The time is measured from the start of a TMC test. Note: The response time of activating and deactivating relays may be different for each type of relay device. |
Deactivate (0 to 7) |
Enter the time (in seconds) to deactivate the relay. The time is measured from the start of a TMC test. Important: All relays are automatically deactivated when a TMC test ends or is cancelled, even if the deactivate time has not been reached. |
All On |
Click this button to activate all enabled relays. |
All Off |
Click this button to deactivate all enabled relays. |
Status (click to test) |
Check a box to manually activate a relay. Uncheck the box to deactivate a relay. Mechanical and reed relays will make a soft click when activated or deactivated. Solid state relays will be silent. |
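Conceptually, each enabled relay is active during a time window defined by its Activate and Deactivate settings. A small sketch of that logic (the schedule data structure is illustrative):

```python
def relay_states(elapsed_s, schedule):
    """Compute which relays should be active at a given time into the test.

    schedule: dict of relay number -> (activate_s, deactivate_s),
              both measured from the start of the TMC test.
    Returns a dict of relay number -> bool (True = activated).
    """
    return {relay: activate <= elapsed_s < deactivate
            for relay, (activate, deactivate) in schedule.items()}

# Relay 0 powers the device for a 30 s test; relay 1 pulses from 5 s to 10 s.
print(relay_states(7.0, {0: (0.0, 30.0), 1: (5.0, 10.0)}))  # -> {0: True, 1: True}
```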
Project Notes
Description
Enter device information, powering instructions, and troubleshooting details in the Project Notes window (see Figure 14). To open the Project Notes window, click the Project Notes item under the Setup menu (or press the button on the top toolbar) of the Thermal Model Comparison window.
Figure 14: Project Notes window
Project File Structure
Description
When a new project is created, the project file, project folder, and database file are created automatically (see Figure 15).
Project File
The project file (see Figure 15) is created and saved in the default directory "Optotherm\Thermalyze\Thermal Model Comparison" by appending .otmc to the project name entered when creating the project. Project files contain all program settings that are associated with Thermal Model Comparison tests.
Figure 15: Project Files Structure
Project Folder
The project folder is created in the default directory "Optotherm\Thermalyze\Thermal Model Comparison" with the same name as the project. All files associated with the project (except for the project file) are stored in this folder.
Project Database File
An empty database file named "TMC.mdb" is saved in the project folder. This database file will contain a list of all test IDs, creation dates, and notes. It will also identify which tests have been added to the Model Group and Fault Group. When a project is opened, all tests that have been added to the Model Group are loaded into memory to speed Model creation. Tests that have been added to the Fault Group, however, are not loaded into memory because loading a large number of tests could slow computer performance.
The database file is automatically updated as changes are made to a project.
Project Regions File
Regions that have been drawn on the thermal image are saved in a binary file named "regions.orgn" within the project folder. Project Regions are saved upon project creation and when the Save Project Regions item under the File menu is clicked on the Thermal Model Comparison window. Saved Regions are automatically opened and displayed when the project is opened.
Project Notes File
Notes entered into the Project Notes window are saved in a binary file named "notes.onts" within the project folder. Project notes are saved upon project creation and when the Save Project Notes item is clicked under the File menu on the Thermal Model Comparison window.
Project Overlay File
An overlay can be associated with the project and saved with the name "overlay.png" in the project folder. The project overlay is saved upon project creation and when the Save Project Overlay item under the File menu is clicked on the Thermal Model Comparison window.
Test History Folder
The Test History folder is created in the project folder after the project has been locked and a new device has been tested. Test image sequences are automatically saved in this folder at the end of each test. The saved sequence file and folder are given the same name as the test ID. The sequence folder contains all of the individual thermal images in the sequence.
Project Setup
The topics in this section explain how to set up a Thermal Model Comparison project.
Start a New Project
Create a New Project
- Open the Thermal Model Comparison window.
- Click the create project button on the top toolbar, enter a name for the new project, and then click Save.
Test Time Settings
- Enter a value for Test Time that is equal to the time required to electrically activate the devices. When testing devices without a boot-up process, set the test time equal to the time required for the majority of device components to reach approximately 50% of their steady-state temperature.
- Select an Analysis Rate that results in 30 or fewer test images (see the sketch following this list).
- On the Thermal Model Comparison window select File >> Save Project Settings to save these settings to the project folder.
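As a sanity check on the Test Time and Analysis Rate combination, assuming the rate is expressed in images per second (adjust if your Test Time window specifies an interval instead):

```python
import math

def image_count(test_time_s: float, analysis_rate_hz: float) -> int:
    """Number of images recorded for a test (aim for 30 or fewer)."""
    return math.ceil(test_time_s * analysis_rate_hz)

# A 60 s test analyzed at 0.5 images per second records 30 images.
print(image_count(60, 0.5))  # -> 30
```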
Lockable Settings
The following Lockable Settings must be set before the project can be locked.
- Enable Image Averaging and set the number of images to average to 32 images. Higher settings can be used to reduce image noise for a long Test Time. Lower settings should be used for short Test Times requiring faster temperature response.
- Make sure the currently installed Camera Lens is selected.
- Set Calibration Range to the lowest range for the currently installed lens to maximize measurement sensitivity. Higher calibration ranges should only be used when device temperatures exceed the low range during the test.
- Make sure the Flip Captured Images Horizontally and Vertically boxes are set appropriately to view thermal images in the correct orientation. Typically, both boxes are checked when the camera is mounted looking down with the top of the camera facing the operator.
- Make sure that Noise Reduction and Image Smoothing are disabled.
- On the Thermal Model Comparison window select File >> Save Project Settings to save these settings to the project folder.
Other Program Settings
- Press the Spectrum Color Palette button.
- Activate Image Subtraction. Note: When activated, there is a gray zone between the green and blue colors of the spectrum palette that represents a temperature change of zero.
- Set the Palette Min and Max values to 0.00°C and 20.00°C, respectively, and then click the Set button to set the image subtraction default max and min values. Note that these values can be adjusted to accommodate the specific heating ranges of each project.
Program Settings: After creating and setting up a new project, save a Program Settings file so that you can recall all of the settings if you need to create a similar project in the future.
Device Position
Procedure
Follow this procedure to position the device using the Positioning Blocks:
-
Using a device as a reference, position the blocks so that the device is centered in the image.
-
Open the Positioning Blocks window and record the positions of the blocks.
-
On the Thermal Model Comparison window select File >> Save Project Settings to save these settings to the project folder.
Camera Position
Description
If your system includes an automated vertical camera stage, follow these directions to set the height of the camera:
-
Open the Linear Stage Control window.
-
Press the Connect button, and then select Online Control Mode.
-
If necessary, home the linear stages.
-
Move the camera to a height at which the device is in view, and then focus the camera lens.
-
On the Thermal Model Comparison window select File >> Save Project Settings to save these settings to the project folder.
Device Power Control
Relay Setup
-
Make sure that the device power supply is unplugged and unpowered.
-
Connect one of the device power supply leads through a relay in the relay device. Devices requiring multiple power levels will require multiple relays. In order to test complex devices, a functional tester may be required to exercise all device components. Synchronizing a Thermal Model Comparison test with a functional tester can be accomplished by sending a signal to the functional tester at the start of a test using one of the relays.
-
Open the Relay Setup window, check the Enable Relays box, and then check each relay that will be used.
-
Enter the appropriate Activate and Deactivate times for each relay. Typically, the Activate time is set to 0.0 and the Deactivate time is set equal to the Test Time so that the device is powered for the duration of the test (see the timing example below).
-
On the Thermal Model Comparison window select File >> Save Project Settings to save these settings to the project folder.
Relay Info: Refer to the relay device manufacturer's documentation for detailed specifications and instructions for making proper electrical connections.
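The Activate/Deactivate timing can be pictured as the following sketch. The relay functions shown are hypothetical placeholders; real relay devices are driven through vendor-specific libraries.

    import time

    # Minimal sketch of relay timing relative to the start of a test.
    # activate_relay/deactivate_relay are hypothetical placeholders for
    # a relay vendor's API.
    TEST_TIME = 60.0            # seconds; matches the project Test Time
    ACTIVATE_AT = 0.0           # power the device at the start of the test
    DEACTIVATE_AT = TEST_TIME   # remove power when the test ends

    def run_relay_schedule(activate_relay, deactivate_relay):
        start = time.monotonic()
        time.sleep(ACTIVATE_AT)
        activate_relay()        # device power ON
        remaining = DEACTIVATE_AT - (time.monotonic() - start)
        time.sleep(max(0.0, remaining))
        deactivate_relay()      # device power OFF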
Lock a Project
Description
While a project is open, changes to Lockable Settings can only be made while the project is unlocked. Once a project file is locked, it cannot be unlocked. This feature prevents inadvertent changes to settings that can invalidate prior test data. Project files should only be locked after test setup is complete and no further adjustments need to be made to lockable settings.
Auto Save: Test sequences are automatically saved at the completion of each test only if the project is locked.
On the Thermal Model Comparison window, click the lock project button on the top toolbar to lock the project.
Locked Project: Once a project is locked, it cannot be unlocked.
Test Regions
Description
Regions can be used to define specific areas on the device to evaluate. Areas such as power leads or connectors may cause false test failures due to high or inconsistent heating and should be excluded. When one or more Regions exist, fail sites are evaluated only in areas enclosed by Regions.
Regions: If no Regions exist, then the entire image is tested.
-
Draw one or more Regions to enclose areas on the device to evaluate.
-
On the Thermal Model Comparison window select File >> Save Project Regions to save the Regions to the project folder.
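Conceptually, Regions act as a mask on the comparison images: only pixels inside a Region are eligible to become fail sites. A minimal sketch of that masking idea, with an illustrative rectangular Region:

    import numpy as np

    # Minimal sketch: ignore comparison results outside the test Regions.
    comparison = np.random.default_rng(3).normal(0.0, 1.0, size=(240, 320))
    mask = np.zeros_like(comparison, dtype=bool)
    mask[50:150, 80:240] = True           # one rectangular Region (illustrative)
    masked = np.where(mask, comparison, 0.0)
    # With no Regions defined, the mask would cover the entire image.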
Model Setup
The topics in this section explain how to create a Thermal Model Comparison model.
Add Tests to Model Group
Procedure
Follow these directions to test operational devices and then add them to the Model Group:
-
Start Capturing Images.
-
Position an operational device using the Positioning Blocks.
-
Select the New Test tab in the Thermal Model Comparison window.
-
Enter a unique ID for the device and type "Operational" in the Notes text box.
-
Wait for the value in the Image Stability field to stabilize after handling the device.
-
Press the ON/OFF button to start the test. The time remaining in the test is displayed to the right of the ON/OFF button. Note: You can press the Cancel Operation button to manually stop the test at any point.
-
When the test ends, the following message is displayed: "Cannot compare the loaded test to the model because no tests have been added to the model group." Click OK to close the message box.
-
Click the Add to Model Group button to add the loaded test to the Model Group.
-
Repeat this procedure to test at least one more operational device and add its test to the Model Group.
Don't Move: The camera and device positions must remain fixed for all tests within the same project. If the camera or device position changes, devices under test will not align properly with Model Group tests, and model comparisons will not be valid.
Create the Model
Model Creation Settings
-
Open the Model Settings window.
-
Select the Small Group option (or select the Large Group option when the number of tests in the Model Group will be 10 or more).
Create the Model
-
Select the Model Group tab.
-
Click the Create Model button to create the Model.
Model Comparison Settings
-
Open the Model Settings window.
-
Enter an initial value for Small Group Sensitivity, such as 8 (or Large Group Std Dev, such as 4).
-
Enter an initial value for Image Noise Level, such as 0.50°C.
-
Enter an initial value for Fail Site Radius, such as 2 pixels.
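The manual does not document the exact limit calculation, but as a rough mental model (an assumption, not the tool's actual algorithm), the per-pixel acceptance limits for a Large Group model can be pictured as a mean plus or minus a multiple of the standard deviation, padded by the Image Noise Level:

    import numpy as np

    # Rough mental model only -- an assumption, not TMC's documented algorithm.
    # 'group' holds one image per Model Group test at a single time interval.
    group = np.random.default_rng(2).normal(5.0, 0.2, size=(12, 240, 320))
    std_dev_setting = 4.0   # Large Group Std Dev (example value)
    noise_level = 0.50      # Image Noise Level in degC

    mean = group.mean(axis=0)
    std = group.std(axis=0)
    high_limit = mean + std_dev_setting * std + noise_level
    low_limit = mean - std_dev_setting * std - noise_level
    # Lowering std_dev_setting or noise_level narrows these limits, which is
    # why decreasing them makes the comparison more sensitive. A Small Group
    # model presumably widens the group min/max by the Small Group
    # Sensitivity setting instead (also an assumption).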
Testing
The topics in this section explain how to perform a Thermal Model Comparison test and locate potential fail sites.
Conduct a Test
Test a Device
-
Position a device using the Positioning Blocks.
-
Start Capturing Images.
-
Wait for the value in the Image Stability field to stabilize after handling the device.
-
Press the ON/OFF button to start the test. The time remaining in the test is displayed to the right of the ON/OFF button. Note: You can press the Cancel Operation button to manually stop the test at any point during the test.
-
When the test ends, the Test Results tab is automatically selected and a comparison sequence is created by comparing each image in the test sequence to a corresponding set of pixel limits in the Model. When a pixel value in the test sequence image falls within the Model high and low limits, the corresponding pixel in the comparison sequence is set to zero. When a pixel value falls outside the Model limits, the corresponding pixel is set to the difference between the pixel value and the closest limit (see the sketch after this list).
-
The PASS/FAIL results of the test are displayed on the Test Results tab.
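The comparison described in the step above can be sketched in a few lines. The array names are illustrative; the limit arrays come from the Model, as in the earlier model sketch.

    import numpy as np

    def compare_image(test_img, low_limit, high_limit):
        # Pixels within [low, high] become zero; pixels outside become the
        # signed difference between the pixel value and the closest limit.
        above = np.where(test_img > high_limit, test_img - high_limit, 0.0)
        below = np.where(test_img < low_limit, test_img - low_limit, 0.0)
        return above + below

    test_img = np.array([[4.0, 5.0, 6.5]])
    lo = np.full_like(test_img, 4.5)
    hi = np.full_like(test_img, 6.0)
    print(compare_image(test_img, lo, hi))   # [[-0.5  0.   0.5]]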
Re-comparing Defective Devices
If a known defective device is tested and passes, model creation and comparison settings can be adjusted in order to detect and locate smaller deviations from the Model. Note that when re-comparing a test, it is not necessary to power the device again. The test image sequence loaded in memory can be compared to the Model with new model comparison settings, or after recreating the Model with new model creation settings.
Change the Model Comparison Settings
-
Decrease Small Group Sensitivity (or Large Group Std Dev) to reduce the ranges of pixel acceptance limits.
-
Decrease Image Noise Level to reduce the ranges of pixel acceptance limits.
-
Decrease the Fail Site Radius to detect smaller fail sites.
-
Click the Compare to Model button to recompare the loaded test sequence to the Model.
Change the Model Creation Settings
-
Decrease Spatial Tolerance to reduce the number of adjacent pixels used to determine the range of pixel acceptance limits.
-
Click the Create Model button to recreate the Model based on the changed setting.
-
Click the Compare to Model button to compare the loaded test sequence to the Model.
Locate Fail Sites
Identify Fail Sites
-
Select the Test Results tab.
-
Set Number Shown to the maximum number of fail sites to display.
-
Drag the Sequence Slider to scan through the comparison sequence images, locating the image with the most fail sites. Often, the last image in the sequence will contain the most fail sites.
-
If the number displayed in the Image Fail Sites field is greater than 20, increase the Fail Site Radius to decrease the number of fail sites (see the sketch after this procedure). Note: You can also reduce other Thermal Model Comparison settings to decrease the number of fail sites.
-
Press the Auto button or manually adjust the Palette Max and Min values to view the fail sites that have been detected. Note: If the Auto Scale Region box is checked and a Region is selected, pressing the Auto Scale button results in scaling the palette to the maximum and minimum values within the selected Region.
-
If more than one fail site is displayed, you can show each fail site individually or all of them at once using the Previous (<), Next (>), and Show All buttons.
-
Draw one or more Regions to mark the locations of the fail sites on the Main Image.
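The manual does not specify how out-of-limit pixels are grouped into fail sites, but the effect of Fail Site Radius can be pictured as merging nearby failing pixels into a single site; a larger radius merges more pixels and therefore reports fewer sites. A rough sketch of that idea (an assumption, not the tool's actual clustering):

    # Rough sketch only -- an assumption about how Fail Site Radius behaves.
    # Greedily merge failing pixels that lie within 'radius' of a site center.
    def group_fail_sites(fail_coords, radius):
        sites = []
        for y, x in fail_coords:
            for sy, sx in sites:
                if (y - sy) ** 2 + (x - sx) ** 2 <= radius ** 2:
                    break                  # absorbed by an existing site
            else:
                sites.append((y, x))       # starts a new fail site
        return sites

    coords = [(10, 10), (11, 11), (10, 12), (40, 40)]
    print(len(group_fail_sites(coords, radius=2)))   # 2 sites
    print(len(group_fail_sites(coords, radius=1)))   # 4 sites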
Locate Fail Sites on the Device
-
Disable Image Averaging and deactivate Image Subtraction in order to view real-time thermal images of the device.
-
Move the tip of your finger or a non-conductive pointer over the device until it is aligned with the Regions that mark the fail sites. Alternatively, you can use the Overlay software tool to locate the fail sites on the device.
Fault Group Search
Add a Fault Description
-
After a defective test device has been successfully troubleshot, select the Test History tab.
-
Load the defective device's test.
-
In the Notes field, enter a description of the failure mode and the repair details.
-
Press the Save Changes button to save the notes.
Add to Fault Group
-
Click the Add to Fault Group button to add the loaded test to the Fault Group.
Search Fault Group
-
After a test has ended and failed, make sure the Test Results tab is selected and press the Search Fault Group button. When the search has completed, similarly behaving devices are listed in order from most to least similar.
-
Use the notes in the list to help troubleshoot the device.