MIT spinout building neural network architectures inspired by biological neural circuits. Its Liquid Foundation Models build on liquid time-constant networks, in which neuron states evolve under continuous-time differential equations with input-dependent time constants, rather than the fixed computation graph of a standard transformer, with the aim of better efficiency and adaptability.
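
To make the "continuous-time dynamics" claim concrete, below is a minimal NumPy sketch of one liquid time-constant (LTC) update step in the style of Hasani et al. (2021), not Liquid AI's actual implementation. The weight names (`W`, `U`, `b`, `tau`, `A`) and the single-tanh-layer gate are illustrative assumptions.

```python
import numpy as np

def ltc_step(x, I, W, U, b, tau, A, dt=0.1):
    """One fused (semi-implicit) Euler step of a liquid time-constant cell.

    The hidden state x follows the ODE
        dx/dt = -(1/tau + f(x, I)) * x + f(x, I) * A
    where f is a small learned network (here a single tanh layer).
    Because f depends on the current input I, the effective time
    constant of each neuron changes as the input stream changes --
    the "liquid", input-adaptive part of the architecture.
    """
    f = np.tanh(W @ x + U @ I + b)  # input-dependent gating signal
    # Fused Euler update; the implicit denominator keeps the step stable:
    return (x + dt * f * A) / (1.0 + dt * (1.0 / tau + f))

# Toy usage: 8 hidden units driven by a 3-dimensional input stream.
rng = np.random.default_rng(0)
h, d = 8, 3
W = rng.normal(size=(h, h)) * 0.1
U = rng.normal(size=(h, d)) * 0.1
b, A = np.zeros(h), rng.normal(size=h)
tau = np.ones(h)

x = np.zeros(h)
for t in range(100):
    I = np.array([np.sin(t * 0.1), np.cos(t * 0.1), 1.0])
    x = ltc_step(x, I, W, U, b, tau, A)
```

The contrast with a transformer is that the state update here is a discretized ODE whose dynamics shift with the input at inference time, rather than a fixed stack of attention and feed-forward blocks.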