We built a computer vision quality-control pipeline for high-speed assembly lines, capable of flagging defects such as scratches, misalignments, missing parts, and wrong labels in real time. The system inspects every unit on the line, not just random samples, and pushes decisions directly into the plant’s manufacturing execution system (MES).
Using optimized PyTorch models and an OpenCV-based preprocessing stack, the solution runs on edge devices on the shop floor, delivering sub-100 ms inference per frame. Operators see live overlays, alerts, and hold actions without slowing down throughput.
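As a rough sketch of what a single-frame inspection step can look like (assuming a TorchScript-exported classifier named defect_model.pt, a single-channel 224×224 input, and a grayscale preprocessing chain; the real model, input size, and preprocessing are plant-specific):

```python
import cv2
import torch

# Hypothetical model path and input size; real values depend on the export and camera.
MODEL_PATH = "defect_model.pt"
INPUT_SIZE = (224, 224)

# Load the TorchScript model once at startup so per-frame latency stays low.
model = torch.jit.load(MODEL_PATH).eval()

def preprocess(frame_bgr):
    """OpenCV-side cleanup before the frame reaches the model."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.equalizeHist(gray)              # tame glare and lighting drift
    gray = cv2.GaussianBlur(gray, (3, 3), 0)   # suppress sensor noise
    resized = cv2.resize(gray, INPUT_SIZE)
    tensor = torch.from_numpy(resized).float().div(255.0)
    return tensor.unsqueeze(0).unsqueeze(0)    # shape (1, 1, H, W)

@torch.inference_mode()
def inspect(frame_bgr):
    """Return (predicted_class_index, confidence) for one frame."""
    logits = model(preprocess(frame_bgr))
    probs = torch.softmax(logits, dim=1)
    conf, cls = probs.max(dim=1)
    return cls.item(), conf.item()
```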
The system reduced manual inspection load, tightened process capability, and gave operations real-time visibility into where and when defects appear along the line.
Manual visual inspection was:
We introduced an AI-driven vision QC pipeline:
The Vision QC Inspector delivered:
The vision pipeline runs on edge hardware mounted near the line, with models tuned specifically for noisy factory conditions (glare, vibration, part variation). The plant’s MES sees only simple pass/fail events and defect codes; all of the heavy AI work stays local.
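The MES-facing payload can be as small as a unit ID, a timestamp, a pass/fail flag, and a defect code. A minimal sketch, assuming a plain HTTP endpoint (the real integration might be OPC UA, REST, or a message bus, and field names vary by MES vendor):

```python
import json
import time
import urllib.request

# Hypothetical endpoint; real MES integrations are often OPC UA, REST, or a message bus.
MES_ENDPOINT = "http://mes.local/api/qc-events"

def push_qc_event(unit_id, passed, defect_code=None):
    """Send only the decision downstream; images and raw model outputs stay on the edge box."""
    event = {
        "unit_id": unit_id,
        "timestamp": time.time(),
        "result": "PASS" if passed else "FAIL",
        "defect_code": defect_code,  # e.g. "SCRATCH_MAJOR"; None when the unit passes
    }
    req = urllib.request.Request(
        MES_ENDPOINT,
        data=json.dumps(event).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=2) as resp:
        return resp.status == 200
```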
Models are trained to recognize multiple defect types per product family, with configurable severity thresholds per plant.
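One way the per-plant thresholds could be expressed is a simple mapping from plant and product family to per-defect confidence cutoffs; the defect names and values below are illustrative, not the production configuration:

```python
# Illustrative per-plant configuration; defect names and cutoffs are examples only.
SEVERITY_THRESHOLDS = {
    "plant_a": {
        "connector_family": {
            "SCRATCH": 0.80,       # hold only when model confidence >= 0.80
            "MISSING_PART": 0.60,  # missing parts are held more aggressively
            "WRONG_LABEL": 0.70,
        },
    },
}

def should_hold(plant, family, defect, confidence):
    """A unit is held when the model's confidence clears the plant-specific cutoff."""
    cutoff = SEVERITY_THRESHOLDS[plant][family].get(defect, 1.0)
    return confidence >= cutoff
```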
Inference runs entirely at the edge to avoid cloud latency and keep production data inside the factory network.
When an operator overrides a decision, the override is logged and can be fed back into retraining to improve the models over time.
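A minimal sketch of what an override record might contain, assuming overrides are appended as JSON lines alongside a reference to the captured frame (the actual storage and schema are deployment-specific):

```python
import json
import time
from pathlib import Path

# Hypothetical log location; in practice this might be a local database or message queue.
OVERRIDE_LOG = Path("/var/qc/overrides.jsonl")

def log_override(unit_id, model_decision, operator_decision, frame_path, operator_id):
    """Append one override record; these become labeled examples for the next retraining run."""
    record = {
        "unit_id": unit_id,
        "timestamp": time.time(),
        "model_decision": model_decision,        # e.g. "FAIL:SCRATCH"
        "operator_decision": operator_decision,  # e.g. "PASS"
        "frame_path": str(frame_path),           # image stays on the edge device
        "operator_id": operator_id,
    }
    with OVERRIDE_LOG.open("a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")
```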
Per-line and per-shift defect dashboards help quality and process engineers focus on the highest-impact issues first.
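The rollup behind those dashboards can be a straightforward aggregation. A sketch assuming defect events are available as a table with line, shift, and defect_code columns (column names are illustrative):

```python
import pandas as pd

def defect_summary(events: pd.DataFrame) -> pd.DataFrame:
    """Count failures per line, shift, and defect code, worst offenders first."""
    return (
        events.groupby(["line", "shift", "defect_code"])
        .size()
        .reset_index(name="count")
        .sort_values("count", ascending=False)
    )

# Tiny illustrative table; real events come from the inspection log.
events = pd.DataFrame({
    "line": ["L1", "L1", "L2"],
    "shift": ["A", "A", "B"],
    "defect_code": ["SCRATCH", "SCRATCH", "WRONG_LABEL"],
})
print(defect_summary(events))
```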
The setup is designed to be replicated: new lines or plants can be onboarded by calibrating cameras, capturing a dataset, and reusing the same core pipeline.
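Camera calibration during onboarding can follow the standard OpenCV checkerboard flow; the board geometry and image folder below are assumptions, not the actual onboarding procedure:

```python
import glob
import cv2
import numpy as np

# Assumed calibration target: a checkerboard with 9 x 6 inner corners.
BOARD_SIZE = (9, 6)

# 3D corner coordinates in the board's own plane (z = 0), in board-square units.
objp = np.zeros((BOARD_SIZE[0] * BOARD_SIZE[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:BOARD_SIZE[0], 0:BOARD_SIZE[1]].T.reshape(-1, 2)

obj_points, img_points = [], []
image_size = None

# Hypothetical folder of calibration shots captured during line onboarding.
for path in glob.glob("calibration_shots/*.png"):
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    image_size = gray.shape[::-1]
    found, corners = cv2.findChessboardCorners(gray, BOARD_SIZE)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

# The intrinsics and distortion coefficients are saved per camera and reused by the pipeline.
rms, camera_matrix, dist_coeffs, _, _ = cv2.calibrateCamera(
    obj_points, img_points, image_size, None, None
)
print("RMS reprojection error:", rms)
```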