AI vision inspection systems are only as accurate as the maintenance routines behind them. Camera drift, lighting degradation, contaminated lenses, and AI model decay are silent failures: they erode inspection accuracy gradually until a defect escapes, a line stops, or a recall is triggered. OxMaint's AI Vision Inspection Integration automates every PM task across your vision systems. Camera calibration, lighting validation, model drift review, and sensor cleaning are scheduled and logged, so accuracy never silently degrades. Book a free demo to see automated vision system PM in action.
PM Coverage Across the 4 Vision System Zones
Every zone in an AI vision inspection system generates maintenance data — calibration records, lighting measurements, model performance logs, and connectivity checks. Without structured PM, accuracy degrades silently. OxMaint captures every task at completion — traceable per camera, per shift, and per line, instantly available for food safety, GMP, and quality audit review.
Before You Start: Configure Your Vision System PM Schedule
Effective vision system PM starts before the first shift. Teams must register every camera, lighting unit, and sensor with asset IDs, set PM intervals per system criticality, assign calibration responsibilities, and establish baseline performance benchmarks — so every scheduled task is meaningful and every deviation from accuracy baseline is immediately flagged. OxMaint's setup wizard configures your full vision system PM schedule in under a day.
Zone 1: Camera & Optics

| # | Task | Schedule | Acceptance Criteria | Sign-Off |
|---|---|---|---|---|
| 1.1 | Lens surface inspected for dust, condensation, smear, or product build-up using a bright light source before line start. Clean with approved optical cloth if any contamination found. | Daily | Lens confirmed optically clear. No visible contamination under inspection light. Cleaning logged if performed. | ________ |
| 1.2 | Camera housing and protective glass inspected for cracks, condensation on inner surface, or loose seals that could allow ingress during production or CIP cycles. | Daily | Housing sealed with no cracks or condensation. Any seal breach triggers immediate maintenance hold. | ________ |
| 1.3 | Live image quality confirmed against reference frame: contrast, sharpness, and field of view checked at the start of each shift using the system's built-in reference check or a standard test card. | Daily | Image matches reference within defined contrast and sharpness tolerance. Result logged with timestamp. | ________ |
| 1.4 | Calibration target run and result compared against baseline accuracy. Focus and aperture checked against the last recorded settings. Any drift from baseline logged and escalated. | Weekly | Accuracy within ±2% of baseline. Focus confirmed locked. Drift above threshold triggers recalibration. | ________ |
| 1.5 | Camera mounting bracket, angle, and standoff distance checked against installation spec. Vibration from nearby machinery can shift camera position over time; confirm with a physical measure. | Weekly | Mount angle and standoff distance within ±1mm of installation spec. Fastener torque confirmed. | ________ |
| 1.6 | Full geometric calibration run using certified calibration target across the full field of view. Colour calibration verified for colour-sensitive inspection applications. All results recorded in CMMS asset history. | Monthly | Geometric distortion within OEM spec. Colour deviation below threshold. Full results archived in asset record. | ________ |
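The weekly calibration check (task 1.4) reduces to a simple tolerance comparison. A minimal sketch, assuming accuracy is expressed as a percentage and the function and threshold names are hypothetical illustrations rather than an OxMaint API:

```python
def within_tolerance(measured: float, baseline: float, pct: float = 2.0) -> bool:
    """Return True if `measured` is within ±pct% of `baseline` (task 1.4)."""
    return abs(measured - baseline) <= baseline * pct / 100.0

def calibration_check(measured_accuracy: float, baseline_accuracy: float) -> str:
    # Drift beyond ±2% of the recorded baseline triggers recalibration
    # per the acceptance criteria above.
    if within_tolerance(measured_accuracy, baseline_accuracy, 2.0):
        return "PASS"
    return "RECALIBRATE"
```

The same helper applies to any fixed-percentage acceptance criterion in the checklist; only the tolerance argument changes.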
Zone 2: Lighting

| # | Task | Schedule | Acceptance Criteria | Sign-Off |
|---|---|---|---|---|
| 2.1 | Illumination level measured at inspection point with calibrated lux meter and compared against the baseline lux value recorded during system commissioning or last full calibration. | Daily | Measured lux within ±10% of baseline value. Readings below threshold trigger lighting maintenance before production begins. | ________ |
| 2.2 | Lighting uniformity checked by measuring lux at four corners and the centre of the inspection field. Non-uniform lighting creates shadow artefacts that cause false rejects or missed detections. | Daily | Max variance between measurement points below 15% of centre reading. Any hotspot or shadow logged immediately. | ________ |
| 2.3 | Diffuser panels and light shields inspected for product contamination, scratches, or discolouration. Contaminated diffusers create uneven transmission that is not detectable by a simple lux reading. | Weekly | Diffuser optically clear with no product build-up or scratches. Cleaned or replaced if contamination is found. | ________ |
| 2.4 | LED array operating hours logged and compared against OEM replacement threshold. Running hours recorded against the lighting asset record in CMMS for lifecycle tracking. | Weekly | Operating hours logged. Replacement triggered at OEM threshold or if output has dropped >15% from baseline. | ________ |
| 2.5 | Full photometric measurement of LED array output recorded using calibrated lux meter and spectrophotometer where applicable. Results compared against commissioning baseline and archived in CMMS. | Monthly | Output within 10% of commissioning baseline. Colour temperature stable within ±150K for colour-sensitive applications. | ________ |
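The two daily lighting checks (tasks 2.1 and 2.2) can be expressed as short pass/fail functions. A minimal sketch with hypothetical names, assuming lux readings are taken at the four corners plus the centre of the inspection field:

```python
def lux_within_baseline(measured: float, baseline: float, pct: float = 10.0) -> bool:
    """Task 2.1: measured lux must fall within ±pct% of the commissioning baseline."""
    return abs(measured - baseline) <= baseline * pct / 100.0

def uniformity_ok(corners: list, centre: float, max_pct: float = 15.0) -> bool:
    """Task 2.2: the worst deviation of any corner reading from the centre
    reading must stay below max_pct% of the centre value."""
    worst = max(abs(c - centre) for c in corners)
    return worst <= centre * max_pct / 100.0
```

A failure of either check would be logged and would hold the line for lighting maintenance, per the acceptance criteria above.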
Zone 3: AI Model Performance

| # | Task | Schedule | Acceptance Criteria | Sign-Off |
|---|---|---|---|---|
| 3.1 | False reject rate reviewed against the rolling 7-day baseline. A sudden increase in false rejects without a corresponding change in product quality indicates model or environment drift. | Daily | False reject rate within ±1.5% of 7-day baseline. Increases above threshold trigger immediate model review. | ________ |
| 3.2 | False pass rate and missed defect log reviewed against acceptable quality level. Any confirmed defect escape (product passing AI inspection that a human inspector would have caught) is logged as a critical event. | Daily | Zero confirmed defect escapes. Any escape triggers model review and a corrective action work order. | ________ |
| 3.3 | Detection threshold settings reviewed against current production output and defect profile. Any threshold change since the last review is logged with justification, approver, and effective date. | Weekly | Thresholds within approved operating range. Any undocumented change raises a non-conformance. Change log current. | ________ |
| 3.4 | Edge case and borderline detection log reviewed. Images from the current week where the model scored close to the decision boundary are sampled and assessed by a human inspector for classification accuracy. | Weekly | Human reclassification agreement rate above 95%. Systematic misclassification patterns trigger model drift review. | ________ |
| 3.5 | Full model drift assessment conducted: performance metrics trended over 30 days, training data relevance reviewed against current product/packaging specs, and retraining decision documented with sign-off from Quality and AI Engineering. | Monthly | Drift below retraining threshold, or retraining initiated and documented. Monthly review record filed in CMMS. | ________ |
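The daily false-reject review (task 3.1) amounts to comparing today's rate against a rolling average. A minimal sketch with hypothetical names; it interprets the "±1.5%" acceptance criterion as percentage points relative to the 7-day mean, which is an assumption, not a confirmed OxMaint convention:

```python
from collections import deque

class FalseRejectMonitor:
    """Task 3.1 sketch: flag model review when today's false reject rate
    drifts more than `threshold_pp` percentage points from the rolling
    7-day baseline."""

    def __init__(self, window: int = 7, threshold_pp: float = 1.5):
        self.rates = deque(maxlen=window)  # daily false reject rates, in percent
        self.threshold_pp = threshold_pp   # allowed deviation, percentage points

    def check(self, todays_rate: float) -> str:
        verdict = "OK"
        if self.rates:
            baseline = sum(self.rates) / len(self.rates)
            if abs(todays_rate - baseline) > self.threshold_pp:
                verdict = "MODEL_REVIEW"  # escalate per the checklist
        self.rates.append(todays_rate)
        return verdict
```

Because the window drops old readings automatically, the baseline tracks the most recent seven shifts without any manual reset.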
Zone 4: Sensors & Connectivity

| # | Task | Schedule | Acceptance Criteria | Sign-Off |
|---|---|---|---|---|
| 4.1 | Trigger sensor signal confirmed active and responding at correct position using the system diagnostic panel. Trigger timing verified against the inspection window for the current product and line speed. | Daily | Trigger firing at correct position. No missed or double-trigger events in previous shift log. Timing within spec. | ________ |
| 4.2 | Encoder feedback signal checked for slip or dropout events. Encoder counts compared against expected line speed values in the control system. Any slip event is logged and escalated to automation. | Daily | No encoder slip or dropout events in previous shift. Count variance within ±0.1% of expected value. | ________ |
| 4.3 | Network latency between camera, vision processor, and PLC/MES measured and compared against the maximum allowable latency for the line speed. Latency above threshold causes rejection signal timing errors. | Weekly | Network latency below maximum threshold defined in system specification. Results logged per device pair. | ________ |
| 4.4 | Data pipeline integrity verified: inspection results confirmed reaching the MES, quality system, and rejection control output correctly. End-to-end test run with known-reject product or test signal. | Weekly | Test signal triggers rejection output correctly. Data confirmed in quality system and MES. No dropped results. | ________ |
| 4.5 | Full I/O audit: all cable connections inspected for wear, pinch, or corrosion at camera, trigger sensor, encoder, rejection solenoid, and network switch. Connector pin condition checked and re-seated where needed. | Monthly | All cables intact with no damage or corrosion. Connectors secure and pin condition confirmed. Findings logged. | ________ |
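The encoder and latency checks (tasks 4.2 and 4.3) are both simple threshold comparisons. A minimal sketch with hypothetical names; the ±0.1% slip tolerance and the latency ceiling come straight from the acceptance criteria above, while everything else is illustrative:

```python
def encoder_slip(counts: int, expected: int, max_pct: float = 0.1) -> bool:
    """Task 4.2: flag a slip event when the count variance exceeds
    ±max_pct% of the expected value for the current line speed."""
    return abs(counts - expected) > expected * max_pct / 100.0

def latency_ok(measured_ms: float, max_ms: float) -> bool:
    """Task 4.3: latency must stay below the system-spec maximum;
    anything above it risks rejection signal timing errors."""
    return measured_ms < max_ms
```

In practice the expected count and latency ceiling would come from the control system and the system specification rather than hard-coded values.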
System Sign-Off — Issued When All 4 Zones Are Confirmed
Each zone's checklist is marked PM Done ✓ on completion. Once all four zones are confirmed, the system sign-off is issued.
Performance Metrics — AI Vision Inspection PM Programme
- Defect Detection Rate: Percentage of defects correctly detected across all inspection stations. AI tracking surfaces which cameras or stations show accuracy drift, enabling targeted recalibration before escapes occur.
- False Reject Rate: Percentage of conforming product incorrectly rejected. Rising false rejects without a quality change signal model drift, lighting degradation, or calibration slip, not a product problem.
- Defect Escapes: Confirmed defective units passing AI inspection undetected. Target: zero. Any confirmed escape triggers a mandatory four-zone PM review before the line restarts.
- PM Completion Rate: Percentage of scheduled vision system PM tasks completed on time. Target: 100%. Missed PM tasks are the leading predictor of accuracy degradation events on FMCG inspection lines.
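The two percentage metrics above are straightforward ratios. A minimal sketch with hypothetical function names, shown only to make the definitions concrete:

```python
def pm_completion_rate(completed_on_time: int, scheduled: int) -> float:
    """Percentage of scheduled PM tasks completed on time (target: 100%)."""
    return 100.0 * completed_on_time / scheduled if scheduled else 100.0

def detection_rate(detected: int, total_defects: int) -> float:
    """Percentage of actual defects the vision system correctly flagged."""
    return 100.0 * detected / total_defects if total_defects else 100.0
```

Trending these per camera, per shift, and per line (as the CMMS records allow) is what makes drift visible before it becomes an escape.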