Why is progress slow in generative AI for biology?

9 points by lysozyme 10 months ago | 2 comments
  • owenpalmer 10 months ago
    > The final step involves extracting and testing the molecule via biochemical assay, a process that is time-consuming, costly, and often requires specialized equipment.

    I'm a student currently working with my college's robotics automation department to explore how to automate bioengineering research. I'm still new to the bioengineering sphere, so I'm curious what the primary challenges are in automating assays and what the process involves. If any HN biologists have insight on this, I would love to hear their thoughts!

    • jryb 10 months ago
      Suppose your fancy new robot is wildly successful and lets me examine 2-3 orders of magnitude more sequences. That's still an infinitesimal fraction of all possible proteins with the same function, and now my DNA synthesis and sequencing costs are also 2-3 orders of magnitude higher, which is far from trivial. And given that your company isn't selling a lot of these robots (who even needs this scale?), they aren't going to be cheap either.
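
      To put rough numbers on "infinitesimal fraction", here's a back-of-envelope sketch in Python (the protein length and screen size are illustrative assumptions, not figures from anyone's actual lab):

          import math

          # Back-of-envelope: how much of protein sequence space can one screen cover?
          # Illustrative assumptions: a 300-residue protein, 20 amino acids per
          # position, and a screening budget of 10^5 variants.
          length = 300
          alphabet = 20
          screened = 1e5

          log10_space = length * math.log10(alphabet)         # ~390, i.e. 20^300 ~ 10^390 sequences
          log10_fraction = math.log10(screened) - log10_space

          print(f"sequence space ~ 10^{log10_space:.0f} sequences")
          print(f"fraction screened ~ 10^{log10_fraction:.0f}")   # ~10^-385

      Adding 2-3 orders of magnitude of throughput barely moves that exponent, which is why raw scale alone doesn't answer the question.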

      How do I justify spending all this money? I need some theory of why this search is going to give me anything other than incremental improvements in activity, or whatever metric I care about, but rational design can only take me so far. Generative models aren't going to give me a step change in activity. Why should I be confident that the set of proteins I'm testing has megabucks of potential?

      So sure, limitations in automation are an obstacle to bigger scale, but we often can't make use of the scale that's already achievable. There's certainly room for improvement in the automation space, but unlocking scale is not the only problem you need to solve.