Why Recursion Pharmaceuticals abandoned cell painting for brightfield imaging
66 points by abhishaike 6 months ago | 11 comments

- Mistletoe 6 months ago: I find this fascinating. I used to do high-content screening of cells with machines like the PerkinElmer Operetta and lots of dyes. I would never have thought machine learning would take us full circle back to brightfield.
The meat of this article:
> The trend of ML in general over the past ~15 years has been to strip away more and more of the biases you've encoded about your dataset as you feed it into a model. Computer vision went from hand-crafted interpretable features (e.g. number of circles, number of black pixels when thresholded, etc.), to hand-crafted uninterpretable features (e.g. scale-invariant feature transform), to automatically extracted uninterpretable features (e.g. hidden dimensions of a convolutional neural network). In other words, the bitter lesson: pre-imposing structure on your data is useful for a human, but detrimental to a machine.
The gist, as I understood it, is that ML could already identify and label the organelles and cell structures in brightfield images without the dyes, and that adding those on top just muddled and hindered it.
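The first stage of the progression the quote describes can be sketched with a toy example. This is a minimal illustration (assuming numpy; the threshold value and the image are made up for demonstration) of a hand-crafted interpretable feature - counting dark pixels after thresholding:

```python
import numpy as np

def dark_pixel_count(image: np.ndarray, threshold: float = 0.5) -> int:
    """Hand-crafted interpretable feature: number of pixels below a
    brightness threshold in a grayscale image scaled to [0, 1].
    The threshold 0.5 is an arbitrary illustrative choice."""
    return int((image < threshold).sum())

# Toy 4x4 "image" with one dark quadrant standing in for a cell nucleus.
img = np.ones((4, 4))
img[:2, :2] = 0.0
print(dark_pixel_count(img))  # 4
```

A feature like this is trivially explainable to a biologist, which is exactly the kind of pre-imposed structure the quoted passage argues a modern model no longer needs.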
- whatshisface 6 months ago: It makes a lot of sense in retrospect, because all the structures are visible in the brightfield image - it's just more annoying for a human being to pick out the features on thousands of slides.
I wonder what the dye budgets and spectral bands will go to now that there is no need to mark the boundary of the nucleus. I bet there are a lot of things that are not visible in the refractive index that you could dye.
- _ihaque 6 months ago: My team did part of the work described in TFA, AMA.
- _ihaque 6 months ago: I should point out that we are hiring! Software engineering, data science/ML, and IT positions are available in Salt Lake City, Toronto, New York, and London:
- selimthegrim 6 months ago: Do you guys have a cooldown period for applications?
- _ihaque 6 months ago: Not that I know of! Note that the positions we list are typically on different teams, so it's worth reading the descriptions to make sure you're picking the one most appropriate to your experience and interests (and, as a corollary, if you weren't a fit for a previous position, you may be a fit for one that comes up in the future).
- ldsldslds 6 months ago: They don't respond to cold applications in my experience, except a canned rejection after a few months, for some reason.
- shaunregenbaum 6 months ago: Do you guys have high hopes for video data? Were there early results that showed promise for kinetic data?
- _ihaque 6 months ago: Yes, we (and plenty of others) have prior data showing that perturbations (things that you can do to cells, like knocking out genes or putting drug candidates on them) have different effects at different times, so we are excited about the potential of time-course imaging.
Note that "video" is a little different here from the way we usually think of it - think "minutes or hours between frames", not "frames per second".