Many organisms have highly evolved neural circuits for processing visual motion cues. However, most laboratory studies of visual motion perception are performed under highly simplified conditions with no self-motion, such that object motion in the world maps directly onto retinal image motion. Under many natural conditions, by contrast, we must judge object motion during self-motion, which greatly complicates the problem: the brain must parse the complex pattern of retinal image motion into components corresponding to object motion and self-motion. In addition, to compute object motion in world coordinates, the brain must estimate self-motion and factor it into its computations of object motion. I will describe two studies that make important progress toward understanding the visual and multisensory mechanisms by which the brain computes object motion during self-motion. I will show that neural activity in macaque area MT reflects the operation of a “flow parsing” mechanism (previously established in human psychophysics) that discounts the global optic flow resulting from self-motion. I will also show that neural activity in area VIP reflects flexible computation of object motion in either world- or head-centered coordinates. Together, these studies begin to reveal critical neural processes involved in perceiving object motion under more natural conditions in which self-motion also occurs.
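
The flow-parsing idea can be illustrated with a toy vector sketch, under the simplifying assumption that retinal image motion decomposes additively into a self-motion-induced flow component and an object-motion component; the function name `flow_parse` and all numerical values are illustrative, not taken from the studies described.

```python
import numpy as np

def flow_parse(retinal_flow, self_motion_flow):
    """Discount the global optic-flow component attributed to self-motion,
    leaving an estimate of object motion relative to the scene."""
    return retinal_flow - self_motion_flow

# Illustrative example: an observer translating rightward induces a
# leftward background flow on the retina, while an object moves upward
# in the world. The retina sees only the sum of the two components.
self_flow = np.array([-2.0, 0.0])           # deg/s, flow due to self-motion
object_world_motion = np.array([0.0, 1.0])  # deg/s, object motion in the world
retinal = self_flow + object_world_motion   # combined retinal image motion

recovered = flow_parse(retinal, self_flow)
print(recovered)  # [0. 1.] — the object's motion, self-motion discounted
```

In this idealized linear sketch the subtraction is exact; in practice the brain must itself estimate the self-motion flow field (e.g., from visual and vestibular cues), so the discounting is only approximate.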