 

Bulletin #3 Friday 12th April, 2024

 

Important Dates & Reminders

Friday, May 3, 2024 Dissertation, PhD Final Exam and change of grade forms due to TGS for Spring PhD candidates

Monday, May 20, 2024 Registration for Fall 2024 begins

Monday, May 27, 2024 Memorial Day (no classes)

Saturday, June 1, 2024 Spring classes end

 

We want to hear from you! Please send any upcoming news and events to news@cs.northwestern.edu to be included in future bulletins and/or featured on our socials and website.

Events must be emailed at least two (2) business days in advance.

 

In this Issue

Upcoming Seminars:

Monday 15th April

"Simpler Machine Learning Models for a Complicated World" (Cynthia Rudin)

 

Friday 19th April
"Hardware-Aware Efficient Primitives for Machine Learning" (Dan Fu)

 

Wednesday 24th April

"AI Powered Movement Analysis, Big Rehabilitation Data and a Path to Precision Rehabilitation" (James Cotton)

 

CS Events

 

Northwestern Events

 

News

Upcoming CS Seminars

Missed a seminar? No worries!

View past seminars via the Northwestern CS Website

(Northwestern login required).

View Past Seminars
 

April

Friday, 12th - Juncheng Yan

Monday, 15th - Cynthia Rudin

Friday, 19th - Dan Fu

Wednesday, 24th - James Cotton

Monday, 29th - Kaize Ding  

 

May

Wednesday, 1st - Tong Zhang

Friday, 3rd - Shafi Goldwasser

 

Monday / CS Distinguished Lecture
April 15th / 12:15 PM

Hybrid / Kellogg Global Hub 1120

Hosted with the Kellogg Operations Department

Cynthia Rudin, Duke University

"Simpler Machine Learning Models for a Complicated World"

Abstract

While the trend in machine learning has been toward building more complicated (black box) models, such models have not shown any performance advantages for many real-world datasets, and they are more difficult to troubleshoot and use. For these datasets, simpler models (sometimes small enough to fit on an index card) can be just as accurate. However, the design of interpretable models is quite challenging due to the "interaction bottleneck," where domain experts must interact with machine learning algorithms.

 

I will present a new paradigm for interpretable machine learning that solves the interaction bottleneck. In this paradigm, machine learning algorithms are not focused on finding a single optimal model, but instead capture the full collection of good (i.e., low-loss) models, which we call "the Rashomon set." Finding Rashomon sets is extremely computationally difficult, but the benefits are massive. I will present the first algorithm for finding Rashomon sets for a nontrivial function class (sparse decision trees), called TreeFARMS. TreeFARMS, along with its user interface TimberTrek, mitigates the interaction bottleneck for users. TreeFARMS also allows users to incorporate constraints (such as fairness constraints) easily.
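For readers who want a concrete picture of the Rashomon-set idea described above, the sketch below shows the core definition in a few lines of Python. It is only an illustration under simplified assumptions (a fixed list of candidate models with precomputed validation losses); it is not the TreeFARMS algorithm, which searches the space of sparse decision trees directly.

```python
# Minimal sketch of the Rashomon-set idea (illustrative only, not TreeFARMS):
# keep every candidate model whose loss is within a slack epsilon of the best
# loss, instead of returning only the single minimizer.

def rashomon_set(models, losses, epsilon=0.01):
    """Return all (model, loss) pairs within epsilon of the minimum loss."""
    best = min(losses)
    return [(m, l) for m, l in zip(models, losses) if l <= best + epsilon]

# Toy example: three hypothetical candidate models and their validation losses.
candidates = ["depth-2 tree", "depth-3 tree", "depth-7 tree"]
val_losses = [0.120, 0.095, 0.094]
print(rashomon_set(candidates, val_losses))
# -> [('depth-3 tree', 0.095), ('depth-7 tree', 0.094)]
```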

I will also present a "path," that is, a mathematical explanation, for the existence of simpler-yet-accurate models and the circumstances under which they arise. In particular, problems where the outcome is uncertain tend to admit large Rashomon sets and simpler models. Hence, the Rashomon set can shed light on the existence of simpler models for many real-world high-stakes decisions. This conclusion has significant policy implications, as it undermines the main reason for using black box models for decisions that deeply affect people's lives.

 

This is joint work with my colleagues Margo Seltzer and Ron Parr, as well as our exceptional students Chudi Zhong, Lesia Semenova, Jiachang Liu, Rui Xin, Zhi Chen, and Harry Chen. It builds upon the work of many past students and collaborators over the last decade.

 

Here are papers I will discuss in the talk:

 

Rui Xin, Chudi Zhong, Zhi Chen, Takuya Takagi, Margo Seltzer, Cynthia Rudin

Exploring the Whole Rashomon Set of Sparse Decision Trees, NeurIPS (oral), 2022.

https://arxiv.org/abs/2209.08040

 

Zijie J. Wang, Chudi Zhong, Rui Xin, Takuya Takagi, Zhi Chen, Duen Horng Chau, Cynthia Rudin, Margo Seltzer

TimberTrek: Exploring and Curating Sparse Decision Trees with Interactive Visualization, IEEE VIS, 2022.

https://poloclub.github.io/timbertrek/

 

Lesia Semenova, Cynthia Rudin, and Ron Parr

On the Existence of Simpler Machine Learning Models, ACM Conference on Fairness, Accountability, and Transparency (ACM FAccT), 2022.

https://arxiv.org/abs/1908.01755

 

Lesia Semenova, Harry Chen, Ronald Parr, Cynthia Rudin

A Path to Simpler Models Starts With Noise, NeurIPS, 2023.

https://arxiv.org/abs/2310.19726

 
Biography

Cynthia Rudin is the Earl D. McLean, Jr. Professor of Computer Science and Engineering at Duke University. She directs the Interpretable Machine Learning Lab, and her goal is to design predictive models that people can understand. Her lab applies machine learning in many areas, such as healthcare, criminal justice, and energy reliability. She holds degrees from the University at Buffalo and Princeton. She is the recipient of the 2022 Squirrel AI Award for Artificial Intelligence for the Benefit of Humanity from the Association for the Advancement of Artificial Intelligence (the “Nobel Prize of AI”). She received a 2022 Guggenheim fellowship, and is a fellow of the American Statistical Association, the Institute of Mathematical Statistics, and the Association for the Advancement of Artificial Intelligence.

 

Zoom: https://northwestern.zoom.us/j/95918818964?pwd=TFAzbFVENE9KWXlNYkRNZzI1MXROUT09

 

Friday / CS Seminar
April 19th / 12:00 PM

In Person / Mudd 3514

Dan Fu, Stanford University

"Hardware-Aware Efficient Primitives for Machine Learning"

Abstract

Efficiency is increasingly tied to quality in machine learning, with more efficient training algorithms leading to more powerful models. However, today's most popular machine learning models are built on asymptotically inefficient primitives. For example, attention in Transformers scales quadratically in the input size, while MLPs scale quadratically in model dimension. In this talk, I discuss my work on improving the efficiency of the core primitives in machine learning, with an emphasis on hardware-aware algorithms and long-context applications. First, I focus on replacing attention with gated state space models (SSMs) and convolutions, which scale sub-quadratically in context length. I describe the H3 (Hungry Hungry Hippos) architecture, a gated SSM architecture that matches Transformers in quality up to 3B parameters and achieves 2.4x faster inference. Second, I focus on developing hardware-aware algorithms for SSMs and convolutions. I describe FlashFFTConv, a fast algorithm for computing SSMs and convolutions on GPU by optimizing the Fast Fourier Transform (FFT). FlashFFTConv yields up to 7x speedup and 5x memory savings, even over vendor solutions from Nvidia. Third, I will briefly touch on how these same techniques can also be used to develop sub-quadratic scaling in the model dimension. I will describe Monarch Mixer, which uses a generalization of the FFT to achieve sub-quadratic scaling in both sequence length and model dimension. Throughout the talk, I will give examples of how these ideas are beginning to take hold, with gated SSMs and their variants now leading to state-of-the-art performance in long-context language models, embedding models, and DNA foundation models.
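As a rough illustration of why FFT-based convolutions matter for long contexts, the sketch below (our own NumPy example, not FlashFFTConv's implementation) computes a length-n circular convolution two ways: directly in O(n^2), and via the FFT in O(n log n), where convolution becomes pointwise multiplication in the frequency domain.

```python
# Hedged sketch of the FFT convolution trick that FlashFFTConv optimizes on GPU.
# Plain NumPy illustration only; not the paper's algorithm or kernels.
import numpy as np

def fft_conv(u, k):
    """Circular convolution of u with k via the FFT: O(n log n)."""
    n = len(u)
    return np.fft.irfft(np.fft.rfft(u, n=n) * np.fft.rfft(k, n=n), n=n)

def direct_conv(u, k):
    """The same circular convolution computed directly: O(n^2)."""
    n = len(u)
    return np.array([sum(u[j] * k[(i - j) % n] for j in range(n)) for i in range(n)])

u, k = np.random.randn(256), np.random.randn(256)
print(np.allclose(fft_conv(u, k), direct_conv(u, k)))  # True: same result, far cheaper at long lengths
```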

 
Biography

 

Dan Fu is a PhD student in the Computer Science Department at Stanford University, where he is co-advised by Christopher Ré and Kayvon Fatahalian. His research interests are at the intersection of systems and machine learning. Recently, he has focused on developing algorithms and architectures to make machine learning more efficient, especially for enabling longer-context applications. His research has appeared as oral and spotlight presentations at NeurIPS, ICML, and ICLR, and he received the best student paper runner-up award at UAI. Dan has also been supported by an NDSEG fellowship.

 

Research Area/Interests
machine learning, systems

Wednesday / CS Seminar
April 24th / 12:00 PM

Hybrid / Mudd 3514

R. James Cotton, Northwestern University & Shirley Ryan AbilityLab

"AI Powered Movement Analysis, Big Rehabilitation Data and a Path to Precision Rehabilitation"

Abstract

This talk will discuss multiple methodological lines of work making movement and gait analysis more clinically accessible and biomechanically grounded. This includes reconstruction from synchronized multiview videos, smartphone videos, and wearable sensors. We will also discuss how implicit functions provide a powerful representation to map from time to joint angles, and GPU-accelerated methods that enable end-to-end biomechanical fits from these different modalities. We will then discuss some of the opportunities that large movement data enables, including the use of self-supervised learning to discover gait representations that can function as both diagnostic and response biomarkers. Finally, we will outline a vision for a Causal Framework for Precision Rehabilitation that can model this data to link from impairment to function and identify the optimal dynamic treatment policies to improve rehabilitation outcomes.
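To make the phrase "implicit functions that map from time to joint angles" concrete, here is a hedged, minimal sketch: a synthetic knee-angle trajectory is represented as a continuous function of time by fitting Fourier features with least squares. This only illustrates the representation idea; it is not the speaker's GPU-accelerated biomechanical pipeline, and all data below are made up.

```python
# Illustrative sketch (assumptions: synthetic data, simple Fourier basis).
import numpy as np

t = np.linspace(0.0, 1.0, 200)                    # one gait cycle, normalized time
knee_angle = 30 * np.sin(2 * np.pi * t) + 10      # synthetic joint angle (degrees)

# Basis: angle(t) ~ w . [1, sin(2*pi*k*t), cos(2*pi*k*t), ...], k = 1..K
K = 5
features = np.column_stack(
    [np.ones_like(t)]
    + [np.sin(2 * np.pi * k * t) for k in range(1, K + 1)]
    + [np.cos(2 * np.pi * k * t) for k in range(1, K + 1)]
)
w, *_ = np.linalg.lstsq(features, knee_angle, rcond=None)

# The fitted weights define a continuous map from any time point to a joint angle.
print(np.max(np.abs(features @ w - knee_angle)))  # near zero: trajectory recovered
```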

 
Biography

I am an electrical engineer, neuroscientist, and physiatrist working as a physician-scientist at Shirley Ryan AbilityLab and Assistant Professor in the Northwestern University Department of Physical Medicine and Rehabilitation. I completed my residency in PM&R at Shirley Ryan AbilityLab (formerly the Rehabilitation Institute of Chicago), where I remained as faculty. Prior to that, I obtained a B.S. in Electrical Engineering from Rice University, followed by an MD and PhD in systems neuroscience from Baylor College of Medicine. My lab works at the intersection of artificial intelligence, wearable sensors, computer vision, causal and biomechanical modeling, and novel technologies to more precisely monitor and improve rehabilitation outcomes.

 

Research Area/Interests

AI, computer vision, gait analysis, wearable sensors, rehabilitation

 

Zoom: https://northwestern.zoom.us/j/96053323898?pwd=R2pzNjBhWmtjeldmbU1UNlhHUjc4QT09

 

CS Department Events

Save the Date: End-of-Year Awards

Save the date for the annual end-of-year department awards presentation. This event will take place on Thursday, May 30. Stay tuned for more details.

Thursday, May 30, 2024
3:00PM - 5:00PM

TBA

VentureCat 2024 Applications are Open

VentureCat Applications are Open!

 

Calling all student founders at Northwestern University: VentureCat 2024 Applications will be open Monday, March 25 through Sunday, April 7!

 

VentureCat is Northwestern’s annual student startup competition, where the university’s most promising student-founded startups compete for a prize pool of over $175,000 in non-dilutive funding.

 

Now is your chance to compete – apply here.

Application Period: Monday, March 25 through Sunday, April 7

Virtual

Apply»

Mechanical Engineering Seminar

John Schlueter

Please join us on Tuesday, April 16th, 11am-12pm in Tech B211 or on Zoom for a seminar from John Schlueter, NSF DMREF Program Manager.

 

Zoom link: 

https://northwestern.zoom.us/j/96434867335

 

Tuesday, April 16th, 11am-12pm

Hybrid
Tech B211/Zoom

Zoom Link»

Northwestern Medicine Healthcare AI Forum

The Northwestern Medicine Healthcare AI Forum dives into cutting-edge developments in the field of AI for healthcare. Presenters share the latest published research and technology innovation, and facilitate discussion among attendees.

 

Open to the entire Northwestern Medicine community, the forum is presented by the Center for Collaborative AI in Healthcare, Institute for Artificial Intelligence in Medicine (I.AIM). 

Fridays Bi-Weekly 10:00 AM CT

Hybrid

Register »

Professor Emeritus Bruce Wessels Passes Away

Wessels will be remembered for his notable contributions to the study of thin films and nanostructures and his passion for service and scholarship.

 

Read More

Building Connections with San Francisco Bay Area Employers and Alumni

Twenty-three students traveled to the San Francisco Bay Area over spring break as part of a new Northwestern Computer Science career development pilot initiative.

 

Read More

Three New Affiliate Faculty Members Join Northwestern Computer Science

Professors Nivedita Arora, Matt Groh, and Ermin Wei join the team of affiliate faculty members helping to advance the department’s multidisciplinary academic and research mission.

 

Read More

View all News »

Eminent Epidemiologist Mercedes Carnethon Named Chair of Preventive Medicine

Mercedes Carnethon, PhD, vice chair and Mary Harris Thompson Professor of Preventive Medicine, has been named chair of the Department of Preventive Medicine, effective September 1.

 

Read More

© Robert R. McCormick School of Engineering and Applied Science, Northwestern University

Northwestern Department of Computer Science

Mudd Hall, 2233 Tech Drive, Third Floor, Evanston, Illinois, 60208

Unsubscribe