Software Engineer, Data Scientist, AI Engineer, Quant, Classical Pianist
My skills include...
Originally from San Francisco, I'm now a Computer Science and Data Science student at NYU building AI systems that solve tangible business problems across New York City's financial and tech landscape. I've engineered full-stack platforms and machine learning pipelines for NYC hedge funds, built automated pricing models for real estate companies, and developed NLP tools for academic research centers including NYU's Carter Journalism Institute and Yale. Whether it's automating investment research workflows that process millions of data points daily, or building AI systems that help teams answer complex questions in seconds instead of hours, I love turning messy data challenges into production-ready solutions. Beyond engineering, I'm a classical pianist with 16 years of experience, and I explore creative outlets through Muay Thai and music production.
Built equities-focused AI dashboard with Python/FastAPI integrating Grok API for factor report generation. Developed statistical analysis engine using SciPy for traditional quant metrics and portfolio analytics. Deployed full-stack system combining LLM capabilities with rigorous statistical methods for investment research.
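The statistical engine itself isn't shown here; as a minimal sketch of the kind of traditional quant metric it computes, here is an annualized Sharpe ratio paired with a SciPy significance test (function name and return format are illustrative, not the production API):

```python
import numpy as np
from scipy import stats

def factor_summary(daily_returns, periods_per_year=252):
    """Annualized Sharpe ratio plus a one-sample t-test of the mean
    daily return against zero (is the factor's edge significant?)."""
    r = np.asarray(daily_returns, dtype=float)
    sharpe = np.sqrt(periods_per_year) * r.mean() / r.std(ddof=1)
    t_stat, p_value = stats.ttest_1samp(r, 0.0)
    return {"sharpe": float(sharpe),
            "t_stat": float(t_stat),
            "p_value": float(p_value)}
```

Pairing each point estimate with a p-value is what keeps LLM-generated factor reports anchored to rigorous statistics rather than narrative alone.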
Built firmwide AI chatbot with Python/FastAPI connecting directly to Snowflake for flexible querying across all company data. Integrated Grok API and PostgreSQL for natural language queries enabling instant access to research reports, market data, and internal documents. Production system serving entire investment staff.
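A chatbot that lets an LLM query a warehouse directly needs a guardrail between generated SQL and Snowflake. The check below is a hypothetical sketch of that layer (the real system's validation isn't described in this blurb): accept only single read-only SELECT statements.

```python
import re

SELECT_ONLY = re.compile(r"^\s*select\b", re.IGNORECASE)
FORBIDDEN = re.compile(r"\b(insert|update|delete|drop|alter|create|grant)\b",
                       re.IGNORECASE)

def is_safe_query(sql: str) -> bool:
    """Allow only a single read-only SELECT from the LLM's output."""
    body = sql.rstrip().rstrip(";")
    if ";" in body:          # reject multi-statement payloads
        return False
    return bool(SELECT_ONLY.match(body)) and not FORBIDDEN.search(body)
```

Rejecting anything that isn't a lone SELECT is a blunt but effective default for a read-only research assistant serving an entire investment staff.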
Developed full-stack investor relations tool for NYC hedge fund connected to Snowflake data warehouse. Built automated data validation pipelines with PostgreSQL backend and report generation system that exports client-ready documents in existing formats, eliminating manual data pulling and validation. One-stop platform for generating all client-facing investment reports.
Developed machine learning models predicting stock price movements following short seller report releases. Built feature engineering pipeline extracting signals from report text using NLP and combining with market data. Implemented and compared logistic regression and deep Q-learning (DQL) models for binary classification of post-report price direction, achieving strong predictive performance for trading strategy development.
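To make the shape of this pipeline concrete, here is a toy sketch under stated assumptions: a crude keyword-density stand-in for the NLP signal extraction, and a numpy-only logistic regression in place of the production classifier (lexicon, learning rate, and function names are all illustrative).

```python
import numpy as np

# Illustrative lexicon -- the real text features are far richer than this.
NEGATIVE_TERMS = {"fraud", "overstated", "misleading", "undisclosed"}

def text_signal(report_text: str) -> float:
    """Stand-in NLP feature: density of negative terms in the report."""
    words = [w.strip(".,") for w in report_text.lower().split()]
    return sum(w in NEGATIVE_TERMS for w in words) / max(len(words), 1)

def fit_logistic(X, y, lr=0.5, steps=3000):
    """Plain gradient-descent logistic regression for up/down labels."""
    X = np.column_stack([np.ones(len(X)), X])   # prepend bias column
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(X @ w)))      # predicted P(price up)
        w -= lr * X.T @ (p - np.asarray(y)) / len(y)
    return w

def predict(w, X):
    X = np.column_stack([np.ones(len(X)), X])
    return (1.0 / (1.0 + np.exp(-(X @ w))) >= 0.5).astype(int)
```

The DQL comparison would swap the fit step for a learned action-value policy; the feature interface stays the same, which is what makes the two models directly comparable.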
Built logistic regression model predicting property distress for real estate portfolio management. Developed automated pipeline processing and scoring properties daily with strong predictive accuracy, enabling proactive risk assessment and investment decisions.
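The daily scoring pass might look like the sketch below, assuming a model already fitted offline (coefficients, feature layout, and the 0.7 threshold are hypothetical, not the production values):

```python
import math

def distress_probability(features, weights, bias):
    """Logistic score in [0, 1] from already-fitted coefficients."""
    z = bias + sum(w * x for w, x in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))

def daily_flags(portfolio, weights, bias, threshold=0.7):
    """One scoring pass: ids of properties whose risk crosses the threshold."""
    return [pid for pid, feats in portfolio
            if distress_probability(feats, weights, bias) >= threshold]
```

Separating fitting from scoring is what lets the pipeline re-score the whole portfolio every day without retraining.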
Built NLP API for extracting structured data from unstructured sources using Gemini and Claude APIs. Developed automated web scraping pipeline with intelligent feature extraction and validation, transforming raw web data into clean, actionable datasets. Containerized with Docker for scalable deployment.
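The validation step matters most in a pipeline like this: LLM replies can be malformed or mistyped, so each one should be checked before it enters the dataset. A minimal sketch, with a hypothetical target schema (the real extracted fields depend on the source being scraped):

```python
import json

# Hypothetical schema -- field names here are illustrative only.
REQUIRED_FIELDS = {"name": str, "price": (int, float), "url": str}

def validate_extraction(raw_reply: str):
    """Parse an LLM's JSON reply; return the record only if every
    required field is present with the expected type, else None."""
    try:
        record = json.loads(raw_reply)
    except json.JSONDecodeError:
        return None
    if not isinstance(record, dict):
        return None
    if any(not isinstance(record.get(field), types)
           for field, types in REQUIRED_FIELDS.items()):
        return None
    return record
```

Records that fail validation can be retried with a corrective prompt; only clean ones flow downstream.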
Built ETL monitoring pipeline validating schema integrity and detecting anomalies across millions of records daily. Developed Power BI dashboard tracking data freshness and pipeline health for multiple datasets. Significantly reduced manual audit time with automated alerts and LLM-generated anomaly summaries.
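The two core checks, schema validation and anomaly detection, can be sketched in a few lines each; this is a simplified illustration (z-score thresholding stands in for whatever detection logic production uses):

```python
import statistics

def schema_violations(records, expected_columns):
    """Indices of records whose column set drifted from the expected schema."""
    expected = set(expected_columns)
    return [i for i, rec in enumerate(records) if set(rec) != expected]

def zscore_anomalies(values, threshold=3.0):
    """Indices of values more than `threshold` standard deviations
    from the mean -- candidates for an alert and an LLM summary."""
    mu = statistics.fmean(values)
    sd = statistics.stdev(values)
    return [i for i, v in enumerate(values) if abs(v - mu) / sd > threshold]
```

Flagged indices feed the alerting layer; the LLM's job is only to summarize the anomalies, never to decide what counts as one.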
AlphaQuest
Catenary Alternatives Asset Management
Neue Urban
Arthur L. Carter Journalism Institute
Yale University (HP Funded Research)
New York University