The visual search paradigm is a standard laboratory approach to examining how we find behaviourally relevant objects in a complex environment. Typically, items in this paradigm are static and noiseless, in strong contrast to our dynamic and noisy natural environment. To study visual search under more natural conditions, we developed a novel search paradigm with multiple random dot kinematogram (RDK) apertures. To our knowledge, ours is the first such study with humans to explore the influence of motion and noise on visual search. Participants searched for an RDK aperture with a specific direction of coherent motion (e.g. leftward) among a varying number of RDKs (5, 10, 15) containing movement in the opposite direction. In addition, we manipulated the coherence level (65%, 80%, 90%) of the RDKs in a blocked fashion. The target motion was present on only half the trials, and interestingly, search slopes for target-present trials were negative, most strongly so in the highest noise (lowest coherence) condition. Such negative slopes resemble those seen in texture segmentation paradigms, where perceptual grouping allows the perception of a global ‘texture region’. Here, the target creates a local texture gradient that facilitates attentional capture, and this contrast becomes more pronounced at larger set sizes. As the negative slope grows with higher noise levels, we propose that participants rely more on this ‘texture’ effect as noise increases. We also compared our findings to a static, colour pop-out version of the task (matched in item size, display geometry, etc.). Results suggest the two searches were performed differently, implying that motion may be unique in how it guides visual search.
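
For readers unfamiliar with search-slope analysis: a search slope is the least-squares fit of mean reaction time against set size, so a negative slope means responses get faster as more distractors are added. A minimal sketch of that computation follows; the reaction-time values are illustrative placeholders, not data from this study.

```python
def search_slope(set_sizes, mean_rts):
    """Least-squares slope (ms per item) and intercept of mean RT vs. set size."""
    n = len(set_sizes)
    mx = sum(set_sizes) / n
    my = sum(mean_rts) / n
    num = sum((x - mx) * (y - my) for x, y in zip(set_sizes, mean_rts))
    den = sum((x - mx) ** 2 for x in set_sizes)
    slope = num / den
    intercept = my - slope * mx
    return slope, intercept

# Hypothetical mean RTs (ms) at the set sizes used in the paradigm (5, 10, 15).
slope, intercept = search_slope([5, 10, 15], [820.0, 790.0, 755.0])
# A slope below zero is the negative-slope signature described in the abstract.
```

In practice the fit would be computed per participant and condition (coherence level, target presence) before comparing slopes across conditions.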