Dr. Shin'ya Nishida
Time: 2012-06-22, 12:30 - 14:30
Venue: Room 103, Philosophy Building
Motion perception is one of the most successful research areas in vision science, but recent research has revealed that motion processing is more complex than previously thought [1].
Image motion is first detected by direction-selective neural sensors, each tuned to a particular combination of position, orientation, spatial frequency and feature type (e.g., first-order or second-order). When motion signals are integrated across position and orientation, the visual system adaptively switches between two spatial integration strategies depending on the ambiguity of the local motion signals [2]. The visual system can also integrate motion signals of different spatial frequencies and different feature types, but only when form conditions support grouping of the local motions [3]; that is, form processing exerts strong control over the motion integration process.
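As an illustrative sketch only (not a model presented in the talk or in [1]), the Python snippet below implements a minimal opponent motion-energy detector in the spirit of Adelson & Bergen (1985): a quadrature pair of space-time-oriented filters whose squared, differenced outputs signal the direction of a drifting pattern. The function names, filter parameters and demo stimulus are invented for the example.

```python
import numpy as np

def quadrature_pair(n, freq, sigma=0.2):
    """Even/odd Gabor pair (Gaussian-windowed cosine and sine) sampled on [0, 1]."""
    u = np.linspace(0.0, 1.0, n)
    env = np.exp(-(u - 0.5) ** 2 / (2 * sigma ** 2))
    return env * np.cos(2 * np.pi * freq * u), env * np.sin(2 * np.pi * freq * u)

def opponent_motion_energy(stimulus, sf=4.0, tf=4.0):
    """
    Opponent motion energy of a space-time luminance pattern
    (rows = time, columns = space), after Adelson & Bergen (1985).
    Positive output signals rightward motion, negative signals leftward.
    """
    nt, nx = stimulus.shape
    s_even, s_odd = quadrature_pair(nx, sf)   # even/odd spatial filters
    t_even, t_odd = quadrature_pair(nt, tf)   # even/odd temporal filters
    # Inner products of the stimulus with the four separable filters.
    A = np.sum(stimulus * np.outer(t_even, s_even))
    B = np.sum(stimulus * np.outer(t_odd, s_odd))
    C = np.sum(stimulus * np.outer(t_odd, s_even))
    D = np.sum(stimulus * np.outer(t_even, s_odd))
    right = (A + B) ** 2 + (D - C) ** 2   # energy of the rightward-tuned quadrature pair
    left = (A - B) ** 2 + (D + C) ** 2    # energy of the leftward-tuned pair
    return right - left

# Demo: a sinusoidal grating drifting rightward, then leftward.
t = np.linspace(0.0, 1.0, 64)[:, None]
x = np.linspace(0.0, 1.0, 64)[None, :]
rightward = np.sin(2 * np.pi * (4 * x - 4 * t))
leftward = np.sin(2 * np.pi * (4 * x + 4 * t))
print(opponent_motion_energy(rightward) > 0)  # True
print(opponent_motion_energy(leftward) < 0)   # True
```

A bank of such detectors, each tuned to a different position, orientation and spatial frequency, corresponds to the first, local stage of motion processing described above; the integration across detectors is the stage that [2] and [3] address.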
On the other hand, motion signals affect the processing of the form and color of moving objects. Specifically, sensory signals about an object's form and color are integrated along the trajectory of the moving object [4]. This trajectory integration improves the visibility of the moving object's form and color through temporal integration, without introducing motion blur.
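To make the idea of trajectory integration concrete, here is a toy sketch (an assumed setup, not the stimuli or analysis used in [4]): a noisy one-dimensional "object" translates at a known speed, and averaging the frames after shifting them back along the motion path preserves a sharp, denoised profile, whereas averaging at fixed retinal positions smears it into motion blur.

```python
import numpy as np

rng = np.random.default_rng(0)

# A noisy object (a bright bar) translating rightward by 1 pixel per frame.
n_frames, width, speed = 16, 64, 1
frames = []
for f in range(n_frames):
    frame = np.zeros(width)
    frame[20 + f * speed: 24 + f * speed] = 1.0       # the moving bar
    frames.append(frame + rng.normal(0, 0.5, width))  # heavy sensory noise
frames = np.array(frames)

# Fixed-location (retinotopic) integration: simple average over time.
# The bar is smeared across all the pixels it traversed (motion blur).
fixed_average = frames.mean(axis=0)

# Trajectory integration: shift each frame back along the motion path
# before averaging, so samples from the same object location line up.
aligned = np.array([np.roll(frames[f], -f * speed) for f in range(n_frames)])
trajectory_average = aligned.mean(axis=0)

# The trajectory-aligned average keeps the bar sharp while averaging out
# the noise; the fixed-location average spreads it over ~n_frames pixels.
print("fixed-location peak:", fixed_average.max().round(2))
print("trajectory-aligned peak:", trajectory_average.max().round(2))
```

The sketch assumes the trajectory is known; in the visual system it is the motion signal itself that is thought to provide this alignment.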
If time permits, I would also like to briefly explain our studies on material perception [5].
[1] Nishida, S. (2011). Journal of Vision, 11(5):11, 1-53.
[2] Amano, K., Edwards, M., Badcock, D.R. & Nishida, S. (2009). Journal of Vision, 9(3):4, 1-25.
[3] Maruya, K., Amano, K. & Nishida, S. (2010). Vision Research, 50(11), 1054-1064; Maruya, K. & Nishida, S. (2010). Journal of Vision, 10(13):24, 1-18.
[4] Nishida, S. (2004). Current Biology, 14, 830-839; Nishida, S., Watanabe, J., Kuriki, I. & Tokimoto, T. (2007). Current Biology, 17(4), 366-372; Watanabe, J. & Nishida, S. (2007). Journal of Vision, 7(11), 1-16.
[5] Motoyoshi, I., Nishida, S., Sharan, L. & Adelson, E.H. (2007). Nature, 447(7141), 206-209.