Maxim Zaslavsky
About.
I'm a computer scientist using machine learning to approach problems in biology and healthcare.
Research
I'm a computer science PhD student at Stanford, supported by an NSF Graduate Research Fellowship, following undergraduate studies at Princeton. My research applies machine learning to immunology and genetics. I'm also the lead inventor on multiple pending patents from my industry R&D work.
Product management
I've led engineering teams to ship and grow products to tens of thousands of users.
I worked as a product manager and software engineer at Butterfly Network, a unicorn medical technology startup. Our product made a complex clinical tool accessible to the masses, earning an Apple Design Award and the SXSW Best of Show Award, and landing on the cover of the International New York Times and the Science Times.
Full-stack software engineering
I rank among the top 2% of Stack Overflow users by reputation and hold moderator privileges.
Data science
I worked as a data scientist and software engineer at one of the nation's largest hospitals. Previously, I interned as a data scientist at the New York Times and as a software engineer building data services at Google Nest. I'm skilled at integrating prediction into day-to-day business operations and at making the output of predictive models interpretable to decision makers.
Other
In my free time, I play music — mostly jazz piano, these days. I love to travel and read.
Please get in touch. I live in the Bay Area and would love to meet.
Products.
Butterfly iQ (2018 - 2019)
I was the product manager holistically responsible for Butterfly’s mobile ultrasound imaging software — determining what we build and how we build it. I reported directly to the head of product.
Butterfly iQ is the world’s first whole-body imager. iQ fits in your pocket, connects to your iPhone, and lets you see inside the body, for $2000 — a fraction of the price of a traditional ultrasound machine.
Butterfly’s mission is to make medical imaging more accessible, affordable, and intuitive. In the US, ultrasound is becoming a pillar of the physical examination. Abroad, two-thirds of the world has no access to even basic medical imaging. Handheld, simple-to-use ultrasound at a $2000 price point can help.
What makes Butterfly unique is that the piezoelectric crystals typically used in ultrasound have been replaced by custom-designed semiconductor chips, controlled entirely in software. Rather than swapping ultrasound probes to scan different parts of the body, a clinician scans with a single iQ probe and simply adjusts the scanning preset on their iPhone.
iQ is a medical device with the paradigms that make an iPhone app, an iPhone app. Connect your iQ and immediately start scanning. Swipe on the screen to adjust your image. Easily save to cloud storage and share with colleagues.
I was responsible for Butterfly’s iPhone and iPad software, which programs the iQ device, reconstructs images from ultrasound data, uploads captured images and videos to Butterfly secure cloud storage, and enables sharing and commenting for groups of physicians.
I managed the iQ software product from several months before launch to approximately one year after launch. I shipped:
- Core imaging features: the first Butterfly iQ app for iPhone and then for iPad; onboarding education revealed as the user learns the iQ scanning workflow; improved capture and annotation tools; and advanced imaging modes.
- Sales and marketing initiatives: a referral program that generated hundreds of thousands of dollars in revenue without a dime of incentives; safe deidentified image sharing; and the foundation for SaaS subscription management.
- Underlying infrastructure: speedy uploads, hardware-software compatibility, automatic diagnostic health checks, and lots and lots of usability refinements to get the iQ experience just right.
We won an Apple Design Award ("the Oscars of app development"), the SXSW Best of Show and Best in Health, Med, and Biotech awards, and other awards during my tenure.
Butterfly iQ was featured above the fold on the cover of the International New York Times, on the cover of the Science Times, and elsewhere.
Along with my engineering partners, I was also responsible for our software testing and release process, as we maintained a high-velocity release cadence that is rare in medical devices.
Butterfly Tele-Guidance (2018)
Before I managed the core iQ product, my first assignment at Butterfly was to conceive of and build a way to empower non-ultrasound-trained clinicians with ultrasound insights for the first time, using telemedicine.
As a software engineer and product manager, I specced and built Butterfly Tele-Guidance, a green-field project combining ultrasound imaging with streaming video, computer vision, and augmented reality.
Imagine if any medical professional could unlock answers with ultrasound. Our technology emulates the feeling of an expert over your shoulder, guiding your hand as you scan. See press and footage from CES 2019.
Ultrasound is a game of millimeters. With Tele-Guidance, a novice scanner wouldn't need careful training to understand expert ultrasound terminology for maneuvering the probe. Instead, Tele-Guidance tracks the iQ probe’s position in the novice scanner’s iPhone camera feed. The remote expert can command precise adjustments in augmented reality, leading the novice scanner to the right image and insight.
Within my first 10 weeks, I built a fully functional iOS and web prototype to showcase at a national ultrasound trade show. Then I progressively refined the product through methodical user testing.
I worked as a solo engineer and product manager, with the help of a part-time product manager, pointers from several ultrasound software designers and engineers across the company, and guidance from the head of product and the chief medical officer.
Notable technologies used: Python, Typescript, React, WebRTC, Swift, SceneKit, Three.js, ArUco.
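To give a flavor of how marker-based probe tracking can work, here is a minimal Python/OpenCV sketch. It's a hypothetical illustration, not Butterfly's production code: it assumes an ArUco marker of known size affixed to the probe, placeholder camera intrinsics standing in for a real calibration, and the OpenCV 4.7+ ArUco API.

```python
import cv2
import numpy as np

# Hypothetical sketch of marker-based probe tracking (not Butterfly's actual
# code). Assumes an ArUco marker of known size is affixed to the probe, and
# that camera_matrix / dist_coeffs would come from a real camera calibration.
MARKER_SIZE = 0.02  # marker side length in meters (assumed)
camera_matrix = np.array([[1000.0, 0.0, 640.0],
                          [0.0, 1000.0, 360.0],
                          [0.0, 0.0, 1.0]])  # placeholder intrinsics
dist_coeffs = np.zeros(5)

# 3D corners of the marker in its own coordinate frame.
h = MARKER_SIZE / 2
object_points = np.array([[-h, h, 0], [h, h, 0],
                          [h, -h, 0], [-h, -h, 0]], dtype=np.float32)

aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(aruco_dict, cv2.aruco.DetectorParameters())

cap = cv2.VideoCapture(0)
while cap.isOpened():
    grabbed, frame = cap.read()
    if not grabbed:
        break
    corners, ids, _ = detector.detectMarkers(frame)
    if ids is not None:
        # Recover the marker's 3D pose relative to the camera; an AR overlay
        # (e.g., the remote expert's "tilt left" instruction) could then be
        # anchored at this pose.
        found, rvec, tvec = cv2.solvePnP(
            object_points, corners[0].reshape(4, 2), camera_matrix, dist_coeffs)
        if found:
            cv2.drawFrameAxes(frame, camera_matrix, dist_coeffs, rvec, tvec, h)
    cv2.imshow("probe tracking", frame)
    if cv2.waitKey(1) == 27:  # Esc to quit
        break
cap.release()
```

The real system adds streaming video so the expert sees this feed remotely, but the core loop is the same: detect the marker, estimate its pose, and render guidance in that coordinate frame.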
I ran the process to hire and onboard a tech lead to take over my day-to-day responsibilities, before I moved on to the core Butterfly iQ product.
Research.
I’m fascinated by immunology and novel therapeutics to treat cancer and autoimmune diseases. My research goal is to reframe biology questions as computational problems, address them using statistics and machine learning, and translate those results into medical insight and treatments.
Icahn Institute, Mount Sinai School of Medicine (2016 - 2017)
Cancer researchers are investigating why only a fraction of patients benefit from the new wave of cancer immunotherapies. They believe the composition of the environment around a tumor — especially which immune cells live in the region — may play a role.
But figuring out what’s in the “tumor microenvironment” is slow and expensive. Traditional methods (immunohistochemistry, flow cytometry, or mass cytometry) only measure up to a few dozen "markers" at a time — limiting the number of immune cell types identified.
To get a clearer picture, we can instead estimate the abundances of the different immune cell types computationally. First, measure overall gene expression in the messy mixture extracted from the microenvironment. Then deconvolve that bulk measurement into estimates of the abundances of the cell types that produced it. However, my analyses suggest that existing techniques have some notable shortcomings.
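As a toy illustration of the standard "flat" approach (not any specific published method), deconvolution can be framed as constrained regression of the bulk measurement against reference expression signatures:

```python
import numpy as np
from scipy.optimize import nnls

# Toy sketch of flat mixture deconvolution, for illustration only.
# S holds reference expression signatures (genes x cell types); m is the
# measured bulk expression of the mixture (genes,). All values are simulated.
rng = np.random.default_rng(0)
n_genes, n_cell_types = 500, 5

S = rng.gamma(shape=2.0, scale=1.0, size=(n_genes, n_cell_types))
true_fractions = np.array([0.4, 0.3, 0.15, 0.1, 0.05])
m = S @ true_fractions + rng.normal(scale=0.05, size=n_genes)  # noisy mixture

# Solve min ||S w - m||_2 subject to w >= 0, then normalize to fractions.
w, _ = nnls(S, m)
estimated_fractions = w / w.sum()
print(np.round(estimated_fractions, 3))  # close to true_fractions
```

When cell types have similar signatures, the columns of S become nearly collinear and these point estimates grow unstable, which is exactly the failure mode described next.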
Some methods select marker genes with criteria so strict that the chosen genes turn out to be biologically unrelated to the cell types they're supposed to identify. Others simply struggle to separate challenging mixtures containing similar cell types, while reporting high confidence in their mistaken estimates.
A fundamental problem seems to be that existing methods separate mixtures straight into their most granular subcomponents. These state-of-the-art methods jump directly to estimating the abundances of very precise cell types that are highly similar to one another.
In a messy mixture, that's difficult to get right. And after this risky jump to the most granular level of estimates, these methods provide no way to "roll up" those guesses into broader cell type categories, which could be informative even when a determination at a finer level is out of reach.
With Professor Jeff Hammerbacher and colleagues at the Icahn Institute at Mount Sinai, I developed a Bayesian hierarchical mixture deconvolution method tailored to the kind of RNA-seq gene expression data you would measure in the tumor microenvironment.
Here’s the idea: to make better estimates of what's in a gene expression mixture, first we find and leverage the similarities between the gene expression profiles of different cell types. This approach forms a "hierarchy" of immune cell types.
We can understand the uncertainty of our predictions at every level of granularity in this hierarchical tree. In a real-world sample of all sorts of cells in a tumor microenvironment, our method might be too uncertain to separate a mixture into precise individual cell types, but confident enough to give an accurate answer at a higher level of abstraction — such as separating only into B cells, CD4 T cells, and CD8 T cells. A user can interrogate complex mixtures at increasingly fine levels of granularity, down to the finest level at which the prediction's uncertainty remains acceptable.
We processed and analyzed genomic data to motivate and implement this new hierarchical approach. Once we learn the similarities between cell types from the data, our generative model simulates how these subcomponents could be added together to produce the mixtures in question. Our proof of concept produced significantly more accurate estimates of immune cell types in challenging mixtures.
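Here's a simplified sketch of the roll-up idea, with an invented hierarchy and a stand-in for real posterior samples (not the actual model): because each coarse cell type's fraction is just the sum of its subtypes' fractions, posterior samples at the leaves can be aggregated directly, and uncertainty propagates for free.

```python
import numpy as np

# Simplified sketch of hierarchical roll-up (assumed structure, not the
# paper's exact model). The toy hierarchy and Dirichlet draws below stand in
# for a learned cell type tree and real MCMC posterior samples.
rng = np.random.default_rng(1)

hierarchy = {
    "B cells": ["naive B", "memory B"],
    "CD4 T cells": ["naive CD4", "memory CD4"],
    "CD8 T cells": ["naive CD8", "memory CD8"],
}
leaves = [leaf for subtypes in hierarchy.values() for leaf in subtypes]

# Posterior samples of leaf-level fractions (stand-in for MCMC output).
posterior_samples = rng.dirichlet(alpha=np.ones(len(leaves)) * 2, size=4000)

for coarse, subtypes in hierarchy.items():
    idx = [leaves.index(s) for s in subtypes]
    rolled_up = posterior_samples[:, idx].sum(axis=1)  # sum subtype fractions
    lo, hi = np.percentile(rolled_up, [2.5, 97.5])
    print(f"{coarse}: mean {rolled_up.mean():.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```

Aggregated fractions are larger and relatively better determined than the leaf-level ones, which is what lets a user back off to a coarser answer when the fine-grained estimate is too uncertain.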
Using the Cancer Genome Atlas clinical dataset, we're assessing the association between immune cell infiltration into the tumor microenvironment and a patient’s response to immunotherapy. The results could shed light on a key mystery that prevents broader benefit from new cancer therapies.
My colleagues and I presented the work at a UPenn symposium in May 2017, at the International Cancer Immunotherapy Conference in Germany in September 2017, and at the Society for Immunotherapy of Cancer meeting in D.C. in November 2017. [Preprint, Code]
Projects.
Tens of thousands of Princeton students rely on ReCal, the app that helps you create a schedule from the hundreds of courses offered every semester. I built and launched ReCal with two classmates.
Then I founded a university-sponsored incubator to empower student developers with launch assistance, funding, and long-term maintenance.
In 2014 a friend and I ran the Princeton Silicon Valley TigerTrek, the trip that brings 20 undergrads to Silicon Valley for a week of off-the-record Q&A with the legends of the Valley. We met with Peter Thiel, Meg Whitman, John Doerr, Sal Khan, and others. Now in its 9th year at Princeton, the trip has a steadily growing community.
Sift is a git-inspired personal search utility that gives you full-text search over hard-to-search files like Word documents, LaTeX, and PDFs. Sift also lets you fully index directories before you offload them to long-term cloud storage. It's written to be portable and easily extensible to new file types.
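As a sketch of what "easily extensible to new file types" could look like in practice (a hypothetical design, not Sift's actual code), each file extension can map to a registered extractor function, so supporting a new format means registering one new function:

```python
from pathlib import Path

# Hypothetical extractor registry for a file-type-extensible indexer
# (illustrative only; not Sift's actual code).
EXTRACTORS = {}

def extractor(*extensions):
    """Register a text-extraction function for one or more file extensions."""
    def register(fn):
        for ext in extensions:
            EXTRACTORS[ext] = fn
        return fn
    return register

@extractor(".txt", ".tex")
def extract_plain_text(path: Path) -> str:
    return path.read_text(errors="ignore")

def extract(path: Path):
    """Dispatch on file extension; return None for unsupported types."""
    fn = EXTRACTORS.get(path.suffix.lower())
    return fn(path) if fn else None
```

A PDF or Word extractor would register the same way, keeping the core indexer unaware of individual formats.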
Legato Network is my piano tracking and sharing project for helping traveling musicians find pianos to play on.
Writing.
Every year or two, I write an update letter to stay in touch with old friends and mentors. I post them here because they’re a chronicle of what I’m working on and thinking about now:
Finally, a clip from a recent jazz concert (I'm on piano):