A test engineer is a professional who determines how best to create a process for testing a particular product in manufacturing and related disciplines, in order to assure that the product meets applicable specifications. Test engineers are also responsible for determining the best way to perform a test in order to achieve adequate test coverage. Test engineers often also serve as a liaison between the manufacturing, design engineering, sales engineering, and marketing communities.
Test engineer expertise
A test engineer's expertise depends on which test processes they are most familiar with, although many test engineers are familiar with the full range, from PCB-level processes such as ICT, JTAG, and AXI to PCBA- and system-level processes such as board functional test (BFT/FT), burn-in test, and system-level test (ST). Some of the manufacturing processes in which a test engineer is needed are:
- In-circuit test (ICT)
- Stand-alone JTAG test
- Automated x-ray inspection (AXI) (also known as X-ray test)
- Automated optical inspection (AOI) test
- Center of Gravity (CG) test
- Continuity or flying probe test
- Electromagnetic compatibility or EMI test
- (Board) functional test (BFT/FT)
- Burn-in test
- Environmental stress screening (ESS) test
- Highly Accelerated Life Test (HALT)
- Highly accelerated stress screening (HASS) test
- Insulation test
- Ongoing reliability test (ORT)
- Regression test
- System test (ST)
- Vibration test
- Final quality audit process (FQA) test
Early project involvement from the design phase
Ideally, a test engineer's involvement with a product begins in the very early stages of the engineering design process, i.e., the requirements engineering and design engineering stages. Depending on the culture of the firm, these early stages could involve a Product Requirements Document (PRD) and a Marketing Requirements Document (MRD), some of the earliest work done during a new product introduction (NPI).
By working with or as part of the NPI group, a test engineer ensures that a product is designed for both testability and manufacturability, in other words, that the product can be readily tested and built.
The following are some general rules to ensure testability and manufacturability of a product:
- Making sure the product has correct label specifications and placement, so that the unit is traceable and programmable. Good label specifications ensure that correct information is programmed into the unit under test (UUT, sometimes called the device under test or DUT). To make this possible, test engineers verify that label locations are readable and scannable, eliminating the need to type information into the unit manually; manual typing can introduce inaccurate information due to human error. Automatically placing identification codes on the part during test, and making them available for verification at later processing steps, helps minimize these errors. Also, without test engineering input during the PRD design phase, the hardware engineer in charge of designing the silk-screen for the PCB may place labels underneath an attachable board, rendering them useless (e.g., in a motherboard/daughterboard design, or a board with a pluggable module, a label visible on the main board alone may be obstructed once the other boards are integrated). This information is often indicated in both the PRD and the MRD.
- Making sure that all components required to test and debug the UUT, including the console/serial port, are accessible from the earliest part of the manufacturing process through the last part, which is often the final quality audit/assurance (FQA) process. This also includes making sure those components remain accessible after units are returned by customers for troubleshooting or repair. Following these guidelines eliminates unnecessary opening of the UUT just to access those components, which can introduce errors into the unit (e.g., knocking capacitors or resistors off the board when sliding out the cover, dropping a tool inside the PCBA after opening, or forgetting to reconnect cables before closing the unit for the manufacturing process flow to continue).
- Making sure that all components needed to test the unit are added to the cost matrix of the final product. These components may include the UART/RS-232 chips for talking to the UUT, Ethernet ports for upgrading the firmware, JTAG connectors, etc.
- Defining what manufacturing test processes are needed based on the product definition.
- Verifying that the currently available test equipment is adequate for testing the proposed design. If new equipment is needed, ensuring that budgetary concerns are addressed and that sufficient lead time exists for installation and verification. New test equipment may also require training for test equipment operators and supervisors.
By following the general rules above, test engineers minimize future surprises (such as adding extra components or re-laying out the boards) that drive up costs and delay development of the final product.
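As a sketch of the traceability point above, the snippet below validates a barcode-scanned label before its contents are programmed into the UUT, replacing error-prone manual typing. The label format (a 3-to-5 letter product code, a dash, and an 8-digit serial) is a hypothetical assumption for illustration only; real label specifications come from the PRD.

```python
import re

# Hypothetical label format assumed for illustration:
# product code, dash, 8-digit serial, e.g. "ACME-00012345".
SERIAL_RE = re.compile(r"^[A-Z]{3,5}-\d{8}$")

def validate_scanned_serial(scanned):
    """Validate a barcode-scanned serial before programming it into
    the UUT, so malformed or unreadable labels are caught at once."""
    scanned = scanned.strip()
    if not SERIAL_RE.match(scanned):
        raise ValueError(f"unreadable or malformed label: {scanned!r}")
    return scanned

print(validate_scanned_serial("ACME-00012345"))  # ACME-00012345
```

Rejecting a bad scan immediately, rather than programming whatever was read, keeps incorrect information out of the unit and out of downstream process steps.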
Working with cross-platform hardware and software teams
People often take shortcuts in order to deliver the final product. These shortcuts complicate the product's manufacturability and testability (inability to read and write information, deviations from the process, etc.), increasing the manufacturing complexity of the product and thereby introducing bottlenecks in manufacturing and delays in the delivery schedule.
With this in mind, test engineers are also involved in the following reviews:
- Schematics review - to make sure all components and data/electrical paths are accessible and testable
- Board layout review - to make sure all labels and components are accessible, and that no components sit near edges, covers, movable parts, etc., where there is a higher probability of a component being knocked off the board.
- Electrical specifications review - to make sure the needed power can be driven into the board by any fixture used in any process (an ICT fixture must be able to supply the appropriate power to the board without external power supplies; the burn-in and ESS chambers must be able to provide the required voltage and current to a number of fixtures simultaneously, without modifying the chambers' specifications, so the product can share the chambers with other products).
- Diagnostics specifications review - to make sure command output formats are followed, which simplifies whatever test automation tools will be developed, and to make sure the commands needed to test all components are available.
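To illustrate why a fixed diagnostic output format matters, the hypothetical parser below assumes each diagnostic command prints one line per component in the form `NAME: PASS` or `NAME: FAIL` (an invented format for this sketch, not a standard). Automation built on an agreed format can fail loudly the moment the firmware's output drifts from the specification:

```python
import re

# Hypothetical agreed output format: one "COMPONENT: PASS|FAIL" line each.
LINE_RE = re.compile(r"^(?P<name>\w+):\s+(?P<status>PASS|FAIL)$")

def parse_diag_output(text):
    """Parse diagnostic output into a {component: passed} dict.

    Raises ValueError on any line that deviates from the agreed
    format, so format drift is caught immediately in automation
    rather than silently misread.
    """
    results = {}
    for line in text.strip().splitlines():
        m = LINE_RE.match(line.strip())
        if not m:
            raise ValueError(f"unexpected diagnostic line: {line!r}")
        results[m.group("name")] = m.group("status") == "PASS"
    return results

output = "FAN1: PASS\nDDR0: PASS\nPHY0: FAIL\n"
results = parse_diag_output(output)
print(results)  # {'FAN1': True, 'DDR0': True, 'PHY0': False}
```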
A product's yield plays a very important part throughout its lifespan. A product usually passes through three stages: engineering, initial production (IP), and full production (FP).
- In the early, engineering stage, production yield fluctuates considerably while the manufacturing process is still being debugged and optimized. Foundry engineers usually work with the factories to drive up the yield of the product. Most companies set specific yield targets that each process is expected to hit.
- Once the product yield is stable, usually around 80%, the test engineer is responsible for advancing the product from the engineering stage to initial production. During this period, the test engineer monitors the production yield over time, adjusts the test program limits, and works with foundry engineers to further improve the yield.
- Once the production yield is above 90%, the test engineer can move the product into full production, and will continue to monitor and improve production yield.
In addition, yields show whether another process needs to be introduced (e.g., because the processes already in use cannot capture certain test errors). Yields can also determine whether an existing test process can be trimmed down (step-wise or time-wise) or even eliminated. For example, if ESS errors are consistently captured within the first 3 hours, test time can be cut from the normal 24 hours down to perhaps 4; or, if a process consistently yields 100% over a 15-month period, the teams can get together and decide to eliminate that process altogether.
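The stage thresholds and the ESS trimming arithmetic above can be sketched in a few lines. The 80%/90% cut-offs, the function names, and the one-hour safety margin are illustrative assumptions drawn from the examples in this section, not fixed industry rules:

```python
def production_stage(yield_pct):
    """Map a stable production yield to a lifecycle stage, using the
    illustrative 80%/90% thresholds mentioned above (real targets
    vary by company and product)."""
    if yield_pct >= 90.0:
        return "full production"
    if yield_pct >= 80.0:
        return "initial production"
    return "engineering"

def trimmed_ess_hours(failure_hours, full_duration=24, margin=1):
    """Suggest a shortened ESS duration: the latest observed failure
    time plus a safety margin, capped at the full duration."""
    if not failure_hours:
        return full_duration
    return min(full_duration, max(failure_hours) + margin)

print(production_stage(85.0))        # initial production
print(trimmed_ess_hours([1, 2, 3]))  # errors seen by hour 3 -> run 4 hours
```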
Test automation refers to automating the testing of a product through the use of machines. Depending on the product, these machines can be a combination of automatic test equipment (ATE), a handler, an interface board, and the test program that drives the ATE, as in the case of IC chip testing.
Test automation is a big part of a test engineer's job.
The intentions of automating a test are as follows:
- Enforce test steps to be followed within specifications and correct timing.
- Eliminate manual command and data inputs.
- Automate data gathering.
- Enforce test process flow.
Overall, this drives manufacturing reliability and quality at the end of the line, making sure that all units shipped to customers are well tested, stressed, screened for errors, and configured properly.
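The four bullets above can be sketched as a minimal test executive that runs steps in a fixed order and records results automatically. The class, step names, and halt-on-failure policy are illustrative assumptions, not a description of any particular ATE framework:

```python
import time

class TestFlowError(Exception):
    """Raised when a step fails, halting the flow."""

class TestFlow:
    """Minimal sketch of a test executive: it enforces step order
    (test process flow), removes manual inputs by calling each step
    programmatically, and gathers pass/fail and timing data."""

    def __init__(self, steps):
        self.steps = steps    # ordered list of (name, callable) pairs
        self.records = []     # automated data gathering

    def run(self):
        for name, step in self.steps:       # enforce the flow order
            start = time.time()
            passed = bool(step())
            self.records.append((name, passed, time.time() - start))
            if not passed:
                raise TestFlowError(f"step {name!r} failed; flow halted")
        return self.records

# Illustrative flow with stub steps standing in for real tests.
flow = TestFlow([
    ("power_on", lambda: True),
    ("read_serial_label", lambda: True),
    ("traffic_test", lambda: True),
])
records = flow.run()
print([name for name, passed, _ in records])
```

Because every unit runs the same sequence with the same recorded data, deviations from the process show up as failed or missing steps rather than going unnoticed.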
Defining standard test documents
Following are some of the documents that the test engineers maintain or define:
- Test method
- Diagnostic design specification
- Manufacturing test requirement design specification
- Design for testability (DFT)
- Design for manufacturability (DFM)
- Test plan
- Acceptance test procedure
A contract manufacturer (CM) also provides test engineers for its customers. The function of these test engineers varies with the level of support provided: "interactive and first-level-of-defense"-only support, or partial or ground-up solutions.
Providing interactive and first level-of-defense support
Providing "interactive and first-level-of-defense"-only support is the usual job of the CM TE. Typical job functions for a CM test engineer include:
- Reviewing test solutions with their partnering test engineers from the customer side.
- Analyzing if the infrastructure meets the requirements (from floor/line setup, network access to workstations and/or servers, operator manpower, etc.).
- Getting familiar with the customer products' technology.
- Being able to manage, train, and support the operators who perform the actual testing.
- Being able to debug and isolate problems.
- Gathering information to feed back to their partners.
Because of their close involvement with the test line, they monitor the products going through the line and inspect failed boards to decide whether a board really failed or whether the failure was caused by an improper test setup. Some examples of these false failures are:
- Forgetting to connect the cable used to talk to the UUT (or misplacing it, or leaving it loose). This causes the test automation to time out waiting for a response from the UUT.
- Forgetting to connect the loopback cables when testing a UUT with networking interfaces (Ethernet, optical, etc. ports). This causes the traffic test to fail.
- Skipping a test process. Some test processes configure the UUT to load firmware or put it in a particular state (e.g., preparing it to run in burn-in mode), so if one is skipped, the state the test automation expects will not be satisfied and the test fails.
- Skipping deviations that require hardware/software changes to the UUT.
- Forgetting to power up the unit when the test automation starts. This results in the same timeout problem as the first item on this list.
- Forgetting to attach other test fixture components.
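Several of the false failures above surface as console timeouts. A sketch of how automation might distinguish "no response yet" from "no response at all" is shown below; the function name, polling scheme, and error message are illustrative assumptions, with a simulated console standing in for real UUT hardware:

```python
import time

def wait_for_uut(poll, timeout_s=10.0, interval_s=0.5):
    """Poll the UUT console until it responds or the timeout expires.

    `poll` is any zero-argument callable returning a response string
    or None. A timeout here often indicates a setup error (loose
    console cable, unit never powered on) rather than a genuinely
    bad board, so the error message prompts a setup check first.
    """
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        response = poll()
        if response is not None:
            return response
        time.sleep(interval_s)
    raise TimeoutError(
        "no response from UUT: check console cable, power, and fixture "
        "connections before marking the board as failed"
    )

# Simulated console that responds on the third poll.
attempts = {"n": 0}
def fake_poll():
    attempts["n"] += 1
    return "boot ok" if attempts["n"] >= 3 else None

print(wait_for_uut(fake_poll, timeout_s=5.0, interval_s=0.01))  # boot ok
```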
Providing partial or ground-up solutions
A small number of companies prefer to outsource their test engineering work to their CM. In that case, the CM TEs are in charge of providing the test automation solution, test fixture design, and yield gathering, in addition to the usual interactive and first-level-of-defense support for their customers.
Of course, outsourcing test solutions to the CM has its pros and cons.
Some of the advantages are:
- Lower cost, especially if the CM resides in a country where labor is cheap.
- Beneficial if the company itself does not have, or cannot find, a TE who matches its requirements.
Some of the disadvantages are:
- Being tied to a single CM. It is hard to find a CM that is willing to share information with another CM.
- CM TEs are seldom involved in the product design phase.
- Time constraints. CM TEs are only handed the product specifications late in the NPI stage. Because of this, test solutions are rushed and quality is often compromised.
- Conflict of interest. The company needs detailed information about everything that happens on the product line in order to spot potential problems before they snowball, but the CM may not provide that level of detail, reporting only how many units passed or failed each day. A unit could fail five times before it passes, which may point to timing issues in components of the product such as the CPU or oscillators. The cleaner the first-pass yield data the CM provides, the better the apparent quality of the assembly line; this gives the CM an incentive to report the final result as its first-pass yield so as to reflect higher quality.
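The first-pass versus final yield distinction above is simple arithmetic, sketched below. The function name and the per-unit attempt-history data structure are illustrative assumptions:

```python
def yields(unit_attempts):
    """Compute first-pass yield (units passing on the first attempt)
    and final yield (units eventually passing), both as percentages.

    `unit_attempts` maps a unit serial to its list of test results,
    oldest first, e.g. ["FAIL", "FAIL", "PASS"].
    """
    units = len(unit_attempts)
    first_pass = sum(1 for r in unit_attempts.values() if r and r[0] == "PASS")
    final_pass = sum(1 for r in unit_attempts.values() if "PASS" in r)
    return 100.0 * first_pass / units, 100.0 * final_pass / units

history = {
    "SN001": ["PASS"],
    "SN002": ["FAIL", "FAIL", "FAIL", "FAIL", "FAIL", "PASS"],  # 5 failures first
    "SN003": ["FAIL", "PASS"],
    "SN004": ["PASS"],
}
fpy, fy = yields(history)
print(fpy, fy)  # 50.0 100.0
```

A CM reporting only the final yield (100% here) hides the retries that the first-pass figure (50%) would expose, which is exactly the conflict of interest described above.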
Because it is hard to find a test engineer who knows every aspect of test methodology (from PCB tests such as ICT, JTAG test, flying probe test, and X-ray test, to PCBA tests, which include writing test automation from functional test through FQA test, among others), companies usually outsource the development of the missing test piece to their CM. For example, if none of the in-house TEs knows much about ICT fixtures, they will ask their CM to develop the ICT test solution instead.