There is a paper out of the University of Michigan, published recently in Analytical Methods, that should be required reading for anyone who cites environmental microplastics data, and for anyone who thinks following a standard protocol guarantees a clean result.
The short version: researchers confirmed that simply touching a laboratory sample with a gloved hand deposits residues that get misidentified as microplastic particles by the analytical instruments commonly used to detect them. Not because the instruments are broken. Not because the researchers were sloppy. Because the gloves themselves are contaminating the samples, and the standard quality control recommendations said to wear gloves.
The protective measure intended to prevent contamination was inadvertently generating contamination.
What the paper actually found
The research team at Michigan studied seven common disposable laboratory glove types (nitrile and latex) under controlled conditions. They pressed each glove against a clean substrate at a force comparable to normal hand pressure and then analyzed what transferred. Using spectroscopic analysis, they found that dry contact alone (no soaking, no wet transfer, just a touch) deposited stearate salts onto the surface. Stearate salts are a byproduct of glove manufacturing; they are used as mold-release agents during production.
Here’s where it gets genuinely interesting from a water quality standpoint: stearate salts have a molecular structure similar enough to polyethylene that the automated spectral matching algorithms used in microplastics analysis couldn’t reliably tell them apart. The instruments were seeing something that looked like plastic, calling it plastic, and moving on. The mean false positive count was around 2,000 per square millimeter of analyzed surface from standard nitrile and latex gloves. One glove variety was especially prolific, producing over 7,000 false positives per square millimeter.
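To see why a matching algorithm confuses the two, consider a toy sketch of how automated spectral library search works. Real systems score an unknown FTIR or Raman spectrum against reference spectra with a hit quality index and report anything above a threshold (often around 0.7) as a match. The spectra below are simplified synthetic stand-ins, not instrument data, and the peak positions are only approximate: polyethylene and stearate salts both carry long alkyl chains, so their dominant C-H bands land in nearly the same places.

```python
# Toy illustration of spectral library matching (not the paper's
# method). Spectra are synthetic; peak positions are approximate.
import math

def gaussian_peaks(centers, axis, width=15.0):
    """Build a synthetic spectrum as a sum of Gaussian bands."""
    return [sum(math.exp(-((x - c) / width) ** 2) for c in centers)
            for x in axis]

def cosine_similarity(a, b):
    """Simple match score between two spectra (stand-in for an HQI)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

wavenumbers = range(600, 3200, 4)  # cm^-1, coarse grid

# Polyethylene reference: dominated by C-H stretch and bend bands.
polyethylene = gaussian_peaks([2915, 2848, 1463, 719], wavenumbers)
# Stearate salt: same long alkyl chain, so nearly the same C-H bands,
# plus one carboxylate band that barely moves the overall score.
stearate = gaussian_peaks([2915, 2848, 1540, 1463, 719], wavenumbers)

score = cosine_similarity(polyethylene, stearate)
print(f"match score: {score:.2f}")  # lands well above a 0.7 cutoff
```

The one band that distinguishes the stearate contributes little to a whole-spectrum correlation, which is the basic mechanism behind the false positives: the algorithm weights overall shape, and the overall shapes are nearly identical.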
The researchers went back and looked at an actual environmental dataset (atmospheric microplastics collected from four Michigan locations) and found that glove contamination had inflated their polyethylene counts significantly. Data they had collected following accepted QA/QC protocols was not usable for its intended purpose.
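The standard defense against this kind of inflation is the procedural blank: a clean substrate carried through the same handling steps as the real samples, so that whatever the procedure itself deposits can be measured and subtracted. The numbers below are hypothetical, purely to show the arithmetic; they are not from the paper.

```python
# Illustrative blank correction (a common QA/QC practice; all counts
# here are made up for the example, not taken from the study).
from statistics import mean, stdev

sample_counts = [2450, 2610, 2380]  # hypothetical PE particle counts
blank_counts = [2050, 1980, 2120]   # blanks handled with gloved hands

blank_mean = mean(blank_counts)
blank_sd = stdev(blank_counts)

# Subtract the mean blank signal; clamp at zero since a negative
# particle count has no physical meaning.
corrected = [max(0, c - blank_mean) for c in sample_counts]
print(f"blank mean: {blank_mean:.0f} +/- {blank_sd:.0f}")
print(f"corrected counts: {corrected}")
```

In this made-up scenario most of the apparent signal is procedure, not sample, which is exactly the situation the reanalysis of the Michigan atmospheric dataset revealed.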
This is not an isolated problem
The authors reviewed 26 recent review articles on microplastics quality assurance published between 2018 and 2024. Eighty-one percent of those reviews recommended wearing gloves to protect samples. Only two mentioned that direct contact with samples should be limited, and here’s the part that should give everyone pause: even after a 2020 paper first identified glove-based contamination (under wet conditions), the rate at which subsequent publications recommended glove use barely changed.
The field absorbed a finding that its standard practice was introducing error and largely continued doing what it was doing. That’s not a critique of individual researchers. That’s how institutional momentum works, and it’s exactly the kind of problem that requires persistent, honest scrutiny.
Why this matters
In water treatment, we deal with contaminant testing every day. We send samples to labs. We interpret results. We make recommendations to customers based on those results. At every stage of that chain, there are sources of error that have nothing to do with what’s actually in the water.
I’ve seen this play out in real situations. Elevated lead results traced back to improper sample collection. PFAS detections that were later attributed to sampling equipment. Nitrate readings that fluctuated wildly between labs on split samples from the same well. Some of those discrepancies get investigated, but a lot of them don’t.
The microplastics story is a particularly clean illustration of something we need to be reminded of: the act of measurement can change what you’re measuring. Every time a sample is collected, handled, transferred, stored, processed, and analyzed, there are opportunities to introduce error. Some of those errors elevate contaminant counts. Some suppress them. Protocol compliance reduces the risk, but it does not eliminate it.
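The multiplication of those opportunities is worth making concrete. A back-of-the-envelope sketch, with an assumed per-step risk that I chose purely for illustration:

```python
# Back-of-the-envelope illustration (my numbers, not the paper's):
# if each handling step independently carries even a small chance of
# introducing error, the chance a sample survives the whole chain
# untouched falls off quickly.
steps = ["collect", "handle", "transfer", "store", "process", "analyze"]
p_error = 0.05  # assumed 5% risk per step, purely illustrative

p_clean = (1 - p_error) ** len(steps)
print(f"chance of an error-free result: {p_clean:.1%}")
```

Even at a modest 5% per step, roughly a quarter of results pass through at least one error opportunity, which is why protocol compliance reduces risk without eliminating it.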
What bothers me more than the error itself is the response to it. The Michigan team’s paper was necessary because a 2020 finding about glove contamination produced almost no change in how subsequent researchers handled samples. If your protocol says “wear gloves,” you wear gloves. That’s compliance. It’s not the same as understanding why the protocol exists and whether it’s achieving its intended purpose.
The honest position
Science requires honesty about its own limitations. Not as a disclaimer buried at the end of a paper, but as a core operating principle. The researchers at Michigan deserve credit not just for finding the problem, but for publishing data from a contaminated study rather than discarding it quietly. That’s integrity. They turned an error into a tool for helping others identify and correct the same error in their own datasets.
That’s the standard I hold myself and my work to. When we test water, we should be asking not just what the instrument says, but whether we trust the conditions under which that reading was taken. When we interpret published data (whether it’s about microplastics, PFAS, lead, arsenic, or anything else), we should be asking who collected it, how, and what the known sources of error are.
This is not anti-science. This is science done right. Skepticism about methodology is not the same as skepticism about the underlying phenomenon.
Microplastics in the environment are real. PFAS contamination is real. Lead in drinking water is real. Acknowledging that our measurement tools and procedures have documented failure modes is how we produce results that actually mean something.
If the gloves meant to protect your sample are the ones contaminating it, the honest answer is to say so, figure out how to fix it, and update the protocol. That’s exactly what the Michigan team did.
The water treatment industry has an opportunity here. We sit at the intersection of public health, environmental science, and analytical chemistry. Our customers trust us to interpret testing data accurately and honestly — not just to hand them a lab report and move on. That trust is earned by asking hard questions about methodology, not just accepting results at face value.
As an industry, we have every reason to lead on this. When we hold our own testing practices to a higher standard, when we treat “the protocol says so” as a starting point rather than a final answer, we become more credible advocates for the customers we serve and more effective partners in the broader conversation about water quality and environmental protection. That’s not a burden; it’s our strength.
Read the full paper here:
https://pubs.rsc.org/en/content/articlepdf/2026/ay/d5ay01801c