- Developed the first prototype model for targeting omni-shoppers, delivering strong offline results that justified advancing the model to online experimentation.
- Played a key role in launching a new model to 11 alpha advertisers by ensuring campaign QA, historical data availability, system enablement, and measurement frameworks were in place. It was later rolled out to a $300M+ product.
Used an SVM on claim-note text to predict fraud. This allowed us to automate the work of 15 FTEs.
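A minimal sketch of the kind of text-classification pipeline described above (TF-IDF features feeding a linear SVM). The file name, column names, and hyperparameters are illustrative assumptions, not details from the original project.

```python
# Sketch: TF-IDF + linear SVM for flagging likely-fraudulent claim notes.
# Assumes a table with a free-text "note" column and a binary "is_fraud" label;
# both names are hypothetical.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.metrics import classification_report

claims = pd.read_csv("claim_notes.csv")  # hypothetical export of claim notes
X_train, X_test, y_train, y_test = train_test_split(
    claims["note"], claims["is_fraud"],
    test_size=0.2, stratify=claims["is_fraud"], random_state=0,
)

model = Pipeline([
    ("tfidf", TfidfVectorizer(ngram_range=(1, 2), min_df=5, sublinear_tf=True)),
    ("svm", LinearSVC(C=1.0, class_weight="balanced")),
])
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```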
2014-2016: Developed an incrementality estimation and bidding approach using experimental lift data and an ensemble of conversion models, iteratively validating and tuning via backend tests to achieve a 10X lift in estimated incremental conversions.
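A rough sketch of the idea: average an ensemble of conversion-model predictions, scale them by an experimentally measured lift share, and bid on the resulting incremental value. The ensemble combination, calibration factor, and bid formula below are illustrative assumptions, not the production approach.

```python
# Sketch: combine conversion-model predictions into an incrementality estimate
# and use it to set bids. All numbers and the bid multiplier are toy values.
import numpy as np

def incremental_value(p_models: np.ndarray, lift_calibration: float, value_per_conv: float) -> np.ndarray:
    """p_models: (n_models, n_impressions) predicted conversion probabilities."""
    p_conv = p_models.mean(axis=0)              # simple ensemble average
    p_incremental = p_conv * lift_calibration   # scale by experimentally measured lift share
    return p_incremental * value_per_conv

# Lift calibration from a holdout experiment: share of conversions that are incremental.
treated_rate, control_rate = 0.012, 0.009
lift_calibration = (treated_rate - control_rate) / treated_rate  # ~0.25

p_models = np.random.rand(3, 5) * 0.02          # toy predictions from 3 conversion models
bids = 0.8 * incremental_value(p_models, lift_calibration, value_per_conv=40.0)
print(bids)
```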
2022-2025: Designed and implemented a channel recommendation model to identify the most relevant YouTube channels for a brand based on campaign keywords and URL-derived context. The approach leveraged shared embedding spaces and novel clustering techniques to account for multimodal channel content, paired with a two-stage ranking system optimized for real-time querying at scale. This system reduced channel selection time by ~90%, reproduced expert decisions with >99% precision, and was patented.
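A simplified sketch of two-stage retrieval over a shared embedding space: stage one narrows the query to the nearest content clusters, stage two ranks channels within them. The embeddings, cluster count, and scoring here are toy stand-ins for the patented system.

```python
# Sketch: two-stage channel retrieval over a shared embedding space.
# Stage 1 shortlists the nearest content clusters; stage 2 ranks channels within them.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import normalize

rng = np.random.default_rng(0)
channel_emb = normalize(rng.normal(size=(10_000, 64)))   # one toy vector per channel
clusters = KMeans(n_clusters=100, n_init=10, random_state=0).fit(channel_emb)
centroids = normalize(clusters.cluster_centers_)

def recommend(query_emb: np.ndarray, top_clusters: int = 5, top_k: int = 20) -> np.ndarray:
    """Return indices of the top_k channels for a query (e.g., averaged keyword embeddings)."""
    q = query_emb / np.linalg.norm(query_emb)
    nearest = np.argsort(centroids @ q)[-top_clusters:]              # stage 1: cluster shortlist
    candidates = np.where(np.isin(clusters.labels_, nearest))[0]     # channels in those clusters
    scores = channel_emb[candidates] @ q                             # stage 2: rank candidates
    return candidates[np.argsort(scores)[::-1][:top_k]]

print(recommend(rng.normal(size=64)))
```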
2018-2021: I conducted surveys and matched the responses to Census data to model demand, then used the results in a gravity model to estimate the market size of sports betting in states that were considering legalization.
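A minimal Huff-style gravity model sketch for allocating modeled demand across candidate markets. The demand figures, attractiveness weights, distances, and decay exponent are all illustrative.

```python
# Sketch: Huff-style gravity model allocating county-level betting demand across markets.
import numpy as np

demand = np.array([120_000.0, 80_000.0, 50_000.0])        # modeled annual wagers by county (toy)
attractiveness = np.array([1.0, 0.6])                      # relative pull of each destination market
distance = np.array([[10.0, 40.0],                         # county-to-market distances (miles)
                     [25.0, 15.0],
                     [60.0, 20.0]])
beta = 2.0                                                  # distance-decay exponent

utility = attractiveness / distance**beta                   # gravity utilities
share = utility / utility.sum(axis=1, keepdims=True)        # capture probability per market
market_size = (demand[:, None] * share).sum(axis=0)         # expected wagers captured by each market
print(market_size)
```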
2018: Reverse-engineered pre-packaged GLM software, allowing us to automatically produce modeling packets. This reduced a day-long project to minutes.
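A sketch of the general pattern: fit a GLM directly in code and write out the pieces of a modeling packet (coefficient table, fit summary) rather than assembling them by hand. The Poisson frequency setup, rating factors, and file names are assumptions for illustration.

```python
# Sketch: fit a GLM and export "modeling packet" components programmatically.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
policies = pd.DataFrame({
    "claims": rng.poisson(0.1, size=1_000),
    "vehicle_age": rng.integers(0, 15, size=1_000),
    "territory": rng.choice(["A", "B", "C"], size=1_000),
    "exposure": np.ones(1_000),
})

glm = smf.glm(
    "claims ~ vehicle_age + C(territory)",
    data=policies,
    family=sm.families.Poisson(),
    offset=np.log(policies["exposure"]),
).fit()

glm.summary2().tables[1].to_csv("glm_coefficients.csv")   # coefficient table for the packet
print(glm.summary())
```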
2014-2016: Applied Monte Carlo simulation techniques to estimate aggregate view outcomes for planned video lineups, addressing systematic underprediction caused by outlier effects. The approach enabled reliable percentile-based forecasting, was adopted in a patented solution, and improved planning accuracy for internal stakeholders.
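A small sketch of Monte Carlo percentile forecasting for a lineup total. Sampling each video from a heavy-tailed lognormal (so occasional outliers are represented) is an illustrative assumption, not the patented method.

```python
# Sketch: Monte Carlo percentile forecast for total views of a planned video lineup.
import numpy as np

rng = np.random.default_rng(0)
lineup_mu = np.log([50_000, 120_000, 30_000, 200_000])   # toy per-video median view forecasts
sigma = 0.9                                              # heavy-ish tail on the log scale

draws = rng.lognormal(mean=lineup_mu, sigma=sigma, size=(100_000, len(lineup_mu)))
totals = draws.sum(axis=1)                               # aggregate lineup views per simulation

p10, p50, p90 = np.percentile(totals, [10, 50, 90])
print(f"P10={p10:,.0f}  P50={p50:,.0f}  P90={p90:,.0f}")
```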
2018-2021: Priced our new small ASO product using Monte Carlo simulations.
2014: Redesigned predictions of hosted players' play to improve transparency and interpretability, achieving more accurate identification of high-value players for ~80% of cases and enabling better rewards allocation.
2016-2018: Built a customer lifetime value model on our commercial policy data.
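A toy retention-based lifetime value calculation for context. The margin, retention, and discount-rate inputs are illustrative; the actual model was fit on commercial policy data.

```python
# Sketch: simple retention-based customer lifetime value for a policyholder.
def lifetime_value(annual_margin: float, retention: float, discount_rate: float) -> float:
    """Expected discounted margin over an infinite horizon with constant retention.

    Geometric series: sum over t >= 0 of annual_margin * (retention / (1 + discount_rate))**t.
    """
    return annual_margin * (1 + discount_rate) / (1 + discount_rate - retention)

print(lifetime_value(annual_margin=1_200.0, retention=0.85, discount_rate=0.08))
```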
2014-2016: Project Titan is a software suite I wrote to make sports predictions. It is a large project with web scraping, modeling, and architecture components.
2022-2023: There's a subreddit where people ask for high-risk, high-yield, short-term loans, and various lenders fulfill these loans. My friend and I tried to make money doing this by underwriting users based on their borrowing and account history. We scraped a ton of data from the site, built models on it, and set up a service to notify us of good risks to lend to.
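A rough sketch of the underwriting step: score prospective borrowers from scraped account history with a simple classifier. The feature names, label, and file are hypothetical, not the actual scraped schema.

```python
# Sketch: score borrowers from scraped post/account history with logistic regression.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

loans = pd.read_csv("scraped_loan_history.csv")   # hypothetical scrape output
features = ["account_age_days", "prior_loans", "prior_defaults", "karma", "requested_amount"]

clf = LogisticRegression(max_iter=1000, class_weight="balanced")
print(cross_val_score(clf, loans[features], loans["repaid"], cv=5, scoring="roc_auc"))

clf.fit(loans[features], loans["repaid"])
loans["repay_prob"] = clf.predict_proba(loans[features])[:, 1]
good_risks = loans[loans["repay_prob"] > 0.9]      # candidates worth a notification
```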
2016: A friend and I entered a Kaggle competition about Santa's sleigh. The problem was to find Santa's shortest path while respecting a weight restriction; essentially a travelling salesperson problem on top of a knapsack problem. We approached it with a modified k-means and a hybrid of standard TSP algorithms.
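A simplified sketch of the cluster-then-route idea: cluster deliveries with k-means, check each cluster against a weight cap, and build a greedy nearest-neighbour tour within each cluster. The capacity handling and tour construction are stand-ins for the actual competition solution.

```python
# Sketch: k-means clustering of gifts plus a greedy nearest-neighbour tour per cluster.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
coords = rng.uniform(-90, 90, size=(500, 2))      # toy gift coordinates
weights = rng.uniform(1, 50, size=500)
CAPACITY = 1000.0                                 # sleigh weight limit (illustrative)

labels = KMeans(n_clusters=30, n_init=10, random_state=0).fit_predict(coords)

def nearest_neighbour_tour(points: np.ndarray) -> list[int]:
    """Greedy TSP heuristic: always visit the closest unvisited point next."""
    unvisited = list(range(len(points)))
    tour = [unvisited.pop(0)]
    while unvisited:
        last = points[tour[-1]]
        nxt = min(unvisited, key=lambda i: np.linalg.norm(points[i] - last))
        unvisited.remove(nxt)
        tour.append(nxt)
    return tour

for k in range(30):
    idx = np.where(labels == k)[0]
    if weights[idx].sum() > CAPACITY:             # over capacity: would split/rebalance here
        continue
    tour = nearest_neighbour_tour(coords[idx])
```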
2015: A small project I worked on after a financial statistics class I took in grad school, in which I attempted to account for transaction costs in a modern portfolio theory implementation.
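A minimal sketch of one way to do this: mean-variance optimisation with a proportional transaction-cost penalty on trades away from current weights. The expected returns, covariance, and cost rate are illustrative inputs, not the original project's data.

```python
# Sketch: mean-variance optimisation with a proportional transaction-cost penalty.
import numpy as np
from scipy.optimize import minimize

mu = np.array([0.06, 0.08, 0.05])                 # expected annual returns (toy)
cov = np.array([[0.04, 0.01, 0.00],
                [0.01, 0.09, 0.02],
                [0.00, 0.02, 0.03]])
w_current = np.array([0.4, 0.3, 0.3])             # existing holdings
risk_aversion, cost_rate = 3.0, 0.002             # penalty weights

def objective(w):
    turnover_cost = cost_rate * np.abs(w - w_current).sum()
    return -(w @ mu - risk_aversion * w @ cov @ w - turnover_cost)

result = minimize(
    objective,
    x0=w_current,
    bounds=[(0, 1)] * 3,
    constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1}],
)
print(result.x)
```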
2013: Early-career attempts to model March Madness games.