Estimating object speed in visual scenes is a critical part of perception. While various aspects of speed computation, including discrimination thresholds, neural mechanisms and spatial integration, have been studied, some areas remain to be elucidated. One is the integration of information across spatio-temporal frequency channels to compute speed. We probe this integration with a 2-AFC psychophysical task using moving random-phase noise stimuli with experimenter-defined frequency parameters and bandwidths to target specific neural populations. Stimuli are presented for 300 ms in a large square aperture, and smooth eye movements are recorded while speed discrimination judgements are made over two intervals. Observers are not instructed to pursue the stimuli, and there is no pre-trial saccade to induce a classic ocular following response. After a latency, eye movements nevertheless follow the stimulated direction, presumably to facilitate the speed judgement. Within the two intervals, we randomly vary spatial frequency in one and speed in the other, such that stimuli at the centres of the two ranges are identical. The aim is to characterise the speed response of the recorded eye movements in a context that creates an oculomotor ‘action’ during a perceptual task, rather than artificially separating the two. Within the speed-varied intervals, the strength of the averaged eye movements is systematically modulated by stimulus speed. Within the spatial-frequency-varied intervals, higher frequencies are perceived as faster in the discrimination responses but, interestingly, show no corresponding strengthening of the eye responses; at higher contrasts the eye responses may even be weaker. Thus, for a pair of stimuli matched for contrast and perceived speed, this early eye response appears to be driven by a contrast-dependent, low-level, motion-energy-like computation.
We characterise an underlying spatial-frequency response that is shifted towards lower frequencies, unlike the perceptual response, and is therefore probably separate from perception.