As detector technologies improve, increases in resolution, channel count and overall size create immense bandwidth challenges for the data acquisition system, long post-acquisition processing times and growing data storage costs. Much of the raw data contains no useful information and can be significantly reduced with veto and compression systems. Improvements in artificial intelligence (AI), particularly the many flavours of machine learning (ML), add a powerful tool to data acquisition strategies. Leveraging ML's flexibility and versatility, we propose to embed intelligent algorithms at the edge, that is, early in the detector chain, including within the photon and particle detector readouts themselves, to veto, analyze and compress the data in real time. Placing ML algorithms at the edge of the system presents some of the same challenges as non-AI trigger systems, such as managing power, data flow and calibrations. However, it also adds new elements to consider, such as model training, continuous model updating and stringent provenance tracking for the data used to train these models. This presentation will discuss the strategic importance of developing tools to aid in the design of embedded ML and of building strong validation strategies for ML models used in scientific instruments.