Dr. Sos Agaian is a Distinguished Professor of Computer Science at the Graduate Center and the College of Staten Island, CUNY. Before joining CUNY, he was the Peter T. Flawn Professor at the University of Texas at San Antonio. He also served as a Visiting Professor at Tufts University and a Lead Scientist at Aware, Inc. in Boston, Massachusetts. His research spans computational vision, machine learning, AI, multimedia security, remote sensing, and biomedical imaging. Dr. Agaian has received funding from NSF, DARPA, Google, and other agencies. He has published over 850 articles, 10 books, and 19 book chapters and holds 56 patents/disclosures, many of which have been licensed. He has mentored 45 PhD students and received multiple awards for research and teaching, including the MAEStro Educator of the Year, the Distinguished Research Award, the Innovator of the Year, the Tech Flash Titans Top Researcher Award, and recognition as an Influential Member of the School of Engineering at Tufts University. He is an Associate Editor for several journals, including the IEEE Transactions on Cybernetics. He is a Fellow of the Society for Imaging Science and Technology (IS&T), SPIE (the international society for optics and photonics), the American Association for the Advancement of Science (AAAS), the Institute of Electrical and Electronics Engineers (IEEE), and the Asia-Pacific Artificial Intelligence Association (AAIA), and a Member of Academia Europaea. He has delivered over 35 keynote speeches and 100 invited talks and has co-founded or chaired over 200 international conferences. He has also been a Distinguished Lecturer of the IEEE Systems, Man, and Cybernetics Society.
Title: Bio-Inspired No-Reference Image Quality Assessment: From Human Visual Systems to Computational Models
Abstract: Image Quality Assessment (IQA) remains a significant challenge in computer vision, especially when no reference image is available. Traditional metrics such as Mean Squared Error (MSE) are widely used, yet they correlate poorly with human perception. This keynote presents a comprehensive framework for no-reference (blind) IQA using bio-inspired computational models that bridge human and machine vision. Drawing on insights from computational neuroscience and cognitive science, our approach emulates key characteristics of the human visual system to produce more perceptually accurate quality metrics. We show how bio-inspired architectures can evaluate image quality across the diverse distortions introduced during image acquisition, compression, transmission, and storage, aligning with human judgments more closely than conventional metrics. The presentation addresses fundamental questions about the limitations of MSE-based approaches and introduces novel bio-inspired solutions for blind IQA. We present recent breakthroughs from our research, including new computational models that achieve state-of-the-art performance on standard IQA benchmarks while maintaining biological plausibility. These advances improve quality-assessment accuracy and provide insights into the underlying mechanisms of human visual perception. Our findings have significant implications across multiple domains, from consumer photography and video streaming to medical imaging and computer vision systems. The talk concludes with a discussion of future research directions and the potential impact of bio-inspired IQA on emerging visual technologies.
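The limitation of MSE noted in the abstract can be illustrated with a minimal NumPy sketch (the synthetic gradient image and the distortion parameters are illustrative choices, not material from the talk): a uniform brightness shift and additive Gaussian noise yield virtually identical MSE scores, even though a human observer perceives the brightness shift as nearly invisible and the noise as obvious graininess.

```python
import numpy as np

rng = np.random.default_rng(0)

def mse(a, b):
    """Mean squared error between two images."""
    return float(np.mean((a.astype(np.float64) - b.astype(np.float64)) ** 2))

# A synthetic "image": a smooth horizontal gradient, 256 x 256.
img = np.tile(np.linspace(0.0, 255.0, 256), (256, 1))

# Distortion 1: a uniform brightness shift of +10 gray levels
# (perceptually mild). MSE is exactly 10^2 = 100.
shifted = img + 10.0

# Distortion 2: zero-mean Gaussian noise with standard deviation 10
# (perceptually obvious graininess). Expected MSE is also ~100.
noisy = img + rng.normal(0.0, 10.0, img.shape)

print("MSE, brightness shift:", mse(img, shifted))
print("MSE, Gaussian noise:  ", mse(img, noisy))
```

Both distortions score the same under MSE, which is exactly why perceptually motivated (and, in this talk, bio-inspired) quality metrics are needed to separate them.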