Toward Robust Visual Recognition System under Group Shifts

Fairness has recently attracted growing attention. While human beings can be a major source of bias, machines can also produce biased predictions. As computers are used to make decisions more and more often, fairness-enhancing techniques have drawn considerable interest. Our project focuses on reducing the bias introduced by image classification algorithms, and thereby improving worst-case performance in image classification.
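Worst-case performance under group shift is commonly measured as worst-group accuracy: the accuracy on the group where the classifier does worst. A minimal sketch of that metric, assuming per-sample group labels are available (the function and variable names below are illustrative, not part of the project code):

```python
import numpy as np

def worst_group_accuracy(preds, labels, groups):
    """Per-group accuracy and the minimum over groups.

    Under group shift, a robust model is judged by its weakest
    group rather than its average accuracy.
    """
    accs = {}
    for g in np.unique(groups):
        mask = groups == g
        accs[int(g)] = float((preds[mask] == labels[mask]).mean())
    return accs, min(accs.values())

# Toy example: the classifier under-serves group 1.
preds  = np.array([1, 1, 0, 0, 1, 0])
labels = np.array([1, 1, 0, 1, 0, 0])
groups = np.array([0, 0, 0, 1, 1, 1])
per_group, worst = worst_group_accuracy(preds, labels, groups)
```

Here average accuracy is 4/6, but the worst-group accuracy is only 1/3, which is the number a group-robust method tries to raise.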

Featured Work


Dataset

- Natural images crawled from online sources
- Contains group-shifted data

Our Proposal

Our proposal explains what problem we are trying to solve, why we want to solve it, and the possible steps toward a solution. [LINK]

Our Mid-Term Report

Our mid-term report summarizes our progress to date, presents our results, discusses the difficulties that arose during implementation, and notes how our proposal has changed in light of that progress. [LINK]

Contact Us


©2021 by CS 766 Spring 2021 Final Project. Proudly created with Wix.com