UCLA researchers have developed an AI system that converts fragmented electronic health records (EHR), normally stored in tables, into readable narratives, allowing artificial intelligence to make sense of complex patient histories and to use those narratives for clinical decision support with high accuracy. The Multimodal Embedding Model for EHR (MEME) transforms tabular health data into “pseudonotes” that mirror clinical documentation, so that AI models designed for text can analyze patient information more effectively.
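
The core idea can be sketched in a few lines of Python: serialize each tabular field of a patient record into a short, clinical-style sentence, then hand the resulting text to any off-the-shelf text encoder. The field names and sentence templates below are illustrative assumptions for this sketch, not MEME's actual schema or wording.

```python
# Illustrative sketch of the tabular-to-pseudonote idea.
# Field names and templates are hypothetical, not MEME's actual format.

def row_to_pseudonote(row: dict) -> str:
    """Serialize one patient record (one table row) into clinical-style text."""
    parts = []
    if "age" in row and "sex" in row:
        parts.append(f"Patient is a {row['age']}-year-old {row['sex']}.")
    if row.get("chief_complaint"):
        parts.append(f"Presents with {row['chief_complaint']}.")
    if row.get("medications"):
        parts.append("Current medications: " + ", ".join(row["medications"]) + ".")
    if row.get("heart_rate"):
        parts.append(f"Heart rate {row['heart_rate']} bpm.")
    return " ".join(parts)

record = {
    "age": 67,
    "sex": "female",
    "chief_complaint": "shortness of breath",
    "medications": ["metformin", "lisinopril"],
    "heart_rate": 104,
}

print(row_to_pseudonote(record))
# Patient is a 67-year-old female. Presents with shortness of breath.
# Current medications: metformin, lisinopril. Heart rate 104 bpm.
```

The resulting pseudonote reads like a snippet of clinical documentation, which is why it can be embedded by text-oriented language models and used for downstream decision-support tasks.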