Group Members: Mu Cai, Yunyu (Bella) Bai, Xuechun Yang
Source code and dataset available at: https://github.com/mu-cai/cs766_21spring
Toward Robust Visual Recognition System under Group Shifts
Recently, increasing effort has been devoted to fairness. While human beings can be a major source of bias, machines can also produce biased predictions. As computers are used to make decisions more and more often, fairness-enhancing techniques have received a lot of attention. Our project focuses on reducing the bias introduced by image classification algorithms, thereby improving their worst-case performance.
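To make "worst-case performance" concrete: under group shifts, models are commonly evaluated by worst-group accuracy, i.e., the accuracy on the group where the model performs worst. The snippet below is a minimal illustrative sketch of that metric; the function name, array layout, and toy data are assumptions for illustration, not taken from our codebase.

```python
import numpy as np

def worst_group_accuracy(preds, labels, groups):
    """Sketch: compute per-group accuracy and return the worst one.

    preds, labels, groups are 1-D integer arrays of the same length;
    groups[i] identifies which (e.g., spurious-attribute) group sample i
    belongs to. Names and setup are illustrative only.
    """
    accs = {}
    for g in np.unique(groups):
        mask = groups == g
        accs[int(g)] = float((preds[mask] == labels[mask]).mean())
    return min(accs.values()), accs

# Toy usage: the second group is underrepresented and misclassified more often,
# so average accuracy looks fine while worst-group accuracy is low.
preds  = np.array([1, 1, 0, 1, 0, 0])
labels = np.array([1, 1, 0, 1, 1, 1])
groups = np.array([0, 0, 0, 0, 1, 1])
worst, per_group = worst_group_accuracy(preds, labels, groups)
print(worst, per_group)
```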
Featured Work
Our Proposal
Our proposal explains what problem we are trying to solve, why we want to solve it, and the possible steps toward a solution. [LINK]
Our Mid-term report
Our mid-term report includes a brief summary of our progress up to the mid-term, our results, the difficulties encountered during implementation, and how our proposal may have changed in light of that progress. [LINK]