Christo Wilson's Dossier for Full Professor Promotion

This page contains materials related to Christo Wilson's case for promotion to full professor.

Short Bio

Christo Wilson is an Associate Professor in the Khoury College of Computer Sciences at Northeastern University. He is a founding member of the Cybersecurity and Privacy Institute at Northeastern and serves as director of the BS in Cybersecurity program. Professor Wilson's research spans many topics related to online consumer protection, including cybersecurity, privacy, deception, misinformation, and bias in machine learning. He is a leader in the discipline of algorithmic auditing, an interdisciplinary area that uses experimental techniques to measure the black-box algorithmic systems that pervade daily life, with the goal of increasing the transparency and accountability of these systems. His work is supported by the U.S. National Science Foundation, a Sloan Fellowship, Northwestern University, Underwriters Laboratories, the Mozilla Foundation, the Knight Foundation, the Russell Sage Foundation, the Democracy Fund, the Anti-Defamation League, the Data Transparency Lab, the European Commission, Google, Pymetrics, and Verisign Labs.

Statements, CV, etc.

Sample Publications (Post-Tenure)

  1. Shan Jiang and Ronald E. Robertson and Christo Wilson. Bias Misperceived: The Role of Partisanship and Misinformation in YouTube Comment Moderation. In Proceedings of the International AAAI Conference on Web and Social Media (ICWSM 2019). Munich, Germany, June, 2019.
    Synopsis: In this study we investigate and debunk the rumor that content moderation on social media is systematically biased against political conservatives, using video comment moderation on YouTube as a case study. The study was led by my PhD students Shan Jiang (now at Meta) and Ron Robertson (co-advised with David Lazer; now a postdoc at Stanford). This paper won an Outstanding Analysis Award at ICWSM 2019 and was invited to appear at AAAI 2020.
  2. Christo Wilson and Avijit Ghosh and Shan Jiang and Alan Mislove and Lewis Baker and Janelle Szary and Kelly Trindel and Frida Polli. Building and Auditing Fair Algorithms: A Case Study in Candidate Screening. In Proceedings of the ACM Conference on Fairness, Accountability, and Transparency (FAccT 2021). Virtual Event, Canada, March, 2021.
    Synopsis: In this study we develop methods that allow outside experts to perform "cooperative" bias audits of machine learning systems used by private companies. As a case study, we present the results of applying these methods to audit the machine learning algorithms used by Pymetrics, a candidate screening software provider. I led this audit with assistance from my PhD students Avijit Ghosh and Shan Jiang, as well as from Pymetrics employees Lewis Baker, Janelle Szary, Kelly Trindel, and Frida Polli.
  3. Maggie Van Nortwick and Christo Wilson. Setting the Bar Low: Are Websites Complying With the Minimum Requirements of the CCPA? Proceedings on Privacy Enhancing Technologies (PoPETs), 2022(1), January, 2022.
    Synopsis: This study presents the first large-scale investigation of websites' compliance with the California Consumer Privacy Act (CCPA), the strongest general-applicability online privacy law in the US. A notable facet of the study is that we had to determine which websites the law applied to, which we solved by estimating each website's number of unique visitors from California. The study was led by Northeastern undergraduate Maggie Van Nortwick, who was funded by an NSF REU. Maggie won an Undergraduate Research Award from Northeastern for this study.
  4. James Larisch and Waqar Aqeel and Michael Lum and Yaelle Goldschlag and Leah Kannan and Kasra Torshizi and Yujie Wang and Taejoong Chung and Dave Levin and Bruce M. Maggs and Alan Mislove and Bryan Parno and Christo Wilson. Hammurabi: A Framework for Pluggable, Logic-based X.509 Certificate Validation Policies. In Proceedings of the ACM Conference on Computer and Communications Security (CCS 2022). Los Angeles, CA, November, 2022.
    Synopsis: In this paper we present a system called Hammurabi that enables TLS clients (e.g., web browsers, command line tools, and libraries like OpenSSL) to separate X.509 certificate validation policy (e.g., minimum key sizes, maximum certificate lifetimes) from mechanism (e.g., parsing X.509, verifying cryptographic signatures). Our prototype implementation uses Prolog-based logic programs to specify certificate validation policies, and we reimplemented Firefox's and Chrome's validation policies in Hammurabi to demonstrate the benefits of our approach; a minimal sketch of this policy/mechanism split appears after this list. The project was led by former Northeastern undergraduate James Larisch and Waqar Aqeel (advised by Bruce Maggs), with help from a small army of undergraduates from the University of Maryland (advised by Dave Levin). This paper was the culmination of years of work funded by an NSF Large grant, and it received an Honorable Mention Award at CCS 2022.
  5. Ronald E. Robertson and Jon Green and Damian J. Ruck and Katherine Ognyanova and Christo Wilson and David Lazer. Users choose to engage with more partisan news than they are exposed to on Google Search. Nature, 2023.
    Synopsis: This audit investigated the "filter bubble" hypothesis on Google Search, i.e., the theory that personalization of political search results can leave partisans seeing only information from Google that is congruent with their pre-existing beliefs. Using a two-wave study with hundreds of participants spanning 2018 and 2020, we demonstrate that Google presents all users with comparable political information; in contrast, partisans choose to click on results, and to visit websites in general, that conform to their own political beliefs. The study was led by my PhD student Ron Robertson (co-advised with David Lazer; now a postdoc at Stanford) and Jon Green, with assistance from Damian Ruck and Katya Ognyanova.
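
For readers outside of systems security, the following is a minimal sketch of the policy/mechanism separation described in the Hammurabi synopsis above (publication 4). It is written in Python for accessibility, whereas Hammurabi itself expresses policies as Prolog logic programs; all names and thresholds below are hypothetical illustrations, not Hammurabi's actual interfaces.

# Hypothetical sketch: separating certificate validation *policy* from
# *mechanism*. The mechanism extracts facts from a certificate (parsing,
# signature checks); the policy is a swappable predicate over those facts.
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from typing import Callable

@dataclass
class CertFacts:
    # Facts the mechanism hands to the policy for a verdict.
    subject: str
    key_bits: int
    not_before: datetime
    not_after: datetime
    signature_valid: bool

# A policy is simply a predicate over extracted facts.
Policy = Callable[[CertFacts], bool]

def browser_like_policy(facts: CertFacts) -> bool:
    # Illustrative thresholds only: keys of at least 2048 bits, a lifetime
    # of at most 398 days, and a valid signature.
    lifetime = facts.not_after - facts.not_before
    return (
        facts.signature_valid
        and facts.key_bits >= 2048
        and lifetime <= timedelta(days=398)
    )

def validate(facts: CertFacts, policy: Policy) -> bool:
    # Mechanism entry point: the client supplies whatever policy it wants,
    # so changing policy never requires touching parsing or crypto code.
    return policy(facts)

if __name__ == "__main__":
    now = datetime.now(timezone.utc)
    facts = CertFacts(
        subject="example.com",
        key_bits=2048,
        not_before=now,
        not_after=now + timedelta(days=90),
        signature_valid=True,
    )
    print(validate(facts, browser_like_policy))  # True under this toy policy

The point of the split is that a client (or researcher) can swap in a stricter or more experimental policy without modifying the code that parses certificates or verifies signatures, which is the flexibility Hammurabi provides via pluggable logic programs.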