Show HN: Keep your PyTorch model in VRAM by hot swapping code

77 points by valine 2 months ago | 7 comments
  • NitpickLawyer 2 months ago
    We use Python notebooks for that in the early stages of script testing: load the model in a cell at the top, do your work in the cells below, and once things look good, convert it to a normal Python script.
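    The hot-swap idea from the post title can be approximated in plain Python with `importlib.reload`: keep the expensive object (for a real PyTorch model, the weights sitting in VRAM) alive in a long-running process and reload only the code that operates on it. A minimal sketch, where `train_step` is a hypothetical module standing in for your editable training code and a dict stands in for the loaded model:

    ```python
    import importlib
    import pathlib
    import sys
    import tempfile

    # Skip .pyc caching so each reload re-reads the edited source.
    sys.dont_write_bytecode = True

    # Stand-in for an expensive loaded object (a real nn.Module in VRAM).
    model = {"weights": [1.0, 2.0, 3.0]}

    # Write a throwaway module to simulate the code you are editing.
    workdir = tempfile.mkdtemp()
    step_py = pathlib.Path(workdir) / "train_step.py"
    step_py.write_text("def step(model):\n    return sum(model['weights'])\n")

    sys.path.insert(0, workdir)
    import train_step

    print(train_step.step(model))  # 6.0

    # "Edit" the code on disk (in practice, you edit the file in your editor).
    step_py.write_text("def step(model):\n    return max(model['weights'])\n")

    # Reload only the code; `model` never leaves memory.
    importlib.reload(train_step)
    print(train_step.step(model))  # 3.0
    ```

    The notebook workflow above achieves the same effect interactively: the top cell holds the loaded model, and re-running the cells below swaps the code without reloading the weights.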
    • pizza 2 months ago
      Tensor visualizer app itself already looks pretty interesting
      • valine 2 months ago
        Thanks, I will do a deep writeup on that at some point.
        • kombine 2 months ago
          Are you running both the Dear ImGui visualisation and the training locally? If not, how can it be used in a client-server mode? I think that's the most common requirement for visualisation libraries in deep learning.
          • valine 2 months ago
            The rendering is done with OpenGL, and for remote viewing I just render to an offscreen framebuffer and stream it back to the client with WebRTC. The code for that isn’t public yet, still needs some cleanup.
        • iaw 2 months ago
          Yeah, sadly the link to their visualizations is gated behind X.com
          • CheeksTheGeek 2 months ago
            you can use xcancel.com by adding "cancel" after the "x" in the URL