I am a machine learning researcher interested in deep generative models, reinforcement learning, and AI for clean energy. I am a research scientist at the National Renewable Energy Laboratory (NREL), where I get to apply my work to problems such as building energy management, weather and power forecasting, and discovering novel enzymes. I'm currently studying methods (foundation models for science and neuro-symbolic reinforcement learning) and evaluation strategies for out-of-distribution generalization. NREL is a DOE national lab that uses AI and supercomputing to help tackle the climate crisis, which I believe to be one of the most pressing challenges of our time. I graduated with my Ph.D. in Computer Science from the University of Florida in December 2021.

[CV] [Google Scholar] [GitHub]

Interested in doing a graduate research internship or postdoc with me? Please reach out!

Contact: pemami[at]nrel[dot]gov

News

  • October 2023 One paper accepted in the journal Neural Computation
  • September 2023 BuildingsBench is accepted at NeurIPS'23!
  • July 2023 One paper accepted at IEEE CDC’23
  • July 2023 We released BuildingsBench, a platform for large-scale pretraining + finetuning of building load forecasting models
  • May 2023 I am now a research scientist at NREL!
  • March 2023 Our manuscript on plug & play directed evolution of proteins has been accepted by Machine Learning: Science & Technology

Recent Papers

Non-Stationary Policy Learning for Multi-Timescale Multi-Agent Reinforcement Learning

IEEE CDC'23 (To appear)
Framework for learning non-stationary policies in multi-timescale MARL by leveraging periodicity.
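To give a flavor of the core idea, here is a minimal sketch (not the paper's implementation): when agents act on different timescales, the environment looks non-stationary to each agent, but it becomes stationary once policies are indexed by the phase of the joint action cycle. The toy setup and names below are illustrative.

```python
# Minimal sketch: exploit periodicity in multi-timescale multi-agent RL by
# conditioning each agent's policy on the phase of the joint action cycle.
import math
import random

TIMESCALES = {"fast_agent": 1, "slow_agent": 3}   # agent i acts every k_i steps
PERIOD = math.lcm(*TIMESCALES.values())           # joint cycle length

# One policy table per (agent, phase): non-stationarity within a cycle is
# absorbed by indexing policies with the phase.
N_STATES, N_ACTIONS = 4, 2
policies = {
    (agent, phase): [[1.0 / N_ACTIONS] * N_ACTIONS for _ in range(N_STATES)]
    for agent in TIMESCALES
    for phase in range(PERIOD)
}

def act(agent: str, state: int, t: int):
    """Sample an action if it is this agent's turn; otherwise hold (None)."""
    if t % TIMESCALES[agent] != 0:
        return None                               # agent holds its last action
    phase = t % PERIOD                            # periodic non-stationarity index
    probs = policies[(agent, phase)][state]
    return random.choices(range(N_ACTIONS), weights=probs)[0]

# Rollout skeleton: each phase-indexed policy can now be trained with any
# standard stationary-MDP learner applied per phase.
state = 0
for t in range(2 * PERIOD):
    joint = {agent: act(agent, state, t) for agent in TIMESCALES}
    # ... environment step and per-phase policy update would go here ...
```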

BuildingsBench: A Large-Scale Dataset of 900K Buildings and Benchmark for Short-Term Load Forecasting

NeurIPS'23 (to appear)
Platform for studying large-scale pretraining + zero-shot generalization/finetuning of building load forecasting models.
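For intuition, here is a toy sketch of the zero-shot evaluation setting. This is not the BuildingsBench API (see the repository for the real interface); the model, window sizes, and data shapes below are hypothetical stand-ins.

```python
# Illustrative sketch only: zero-shot load forecasting on an unseen building
# with a generic (pre)trained PyTorch forecaster. Not the BuildingsBench API.
import torch
import torch.nn as nn

CONTEXT, HORIZON = 168, 24   # e.g., one week of hourly load in, one day out

class LoadForecaster(nn.Module):
    """Toy sequence model standing in for a large pretrained forecaster."""
    def __init__(self, hidden: int = 64):
        super().__init__()
        self.encoder = nn.GRU(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, HORIZON)

    def forward(self, past_load: torch.Tensor) -> torch.Tensor:
        _, h = self.encoder(past_load.unsqueeze(-1))   # (1, B, hidden)
        return self.head(h.squeeze(0))                 # (B, HORIZON)

# Zero-shot: apply the pretrained model to an unseen building with no gradient
# updates; the finetuning setting would instead run a few training epochs here.
model = LoadForecaster()
unseen_building = torch.rand(1, CONTEXT)               # hypothetical hourly load
with torch.no_grad():
    forecast = model(unseen_building)                  # (1, HORIZON)
```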

Plug & Play Directed Evolution of Proteins with Gradient-based Discrete MCMC

Machine Learning: Science & Technology, 2023
Also presented at NeurIPS’22 Workshop on Machine Learning in Structural Biology

A fast MCMC sampler for discovering variants by mixing and matching unsupervised evolutionary sequence models with supervised models that map sequence to protein function.
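A rough sketch of the underlying idea follows: sample from a product of experts, log p(x) ∝ log p_evo(x) + f(x)/T, using gradient-informed discrete proposals (in the style of Gibbs-with-Gradients) with a Metropolis correction. This is not the paper's exact sampler; both scoring models here are random stand-ins.

```python
# Hedged sketch: product-of-experts MCMC over discrete protein sequences.
import torch

VOCAB, LENGTH = 20, 8                       # amino acids, toy sequence length
evo_logits = torch.randn(LENGTH, VOCAB)     # stand-in unsupervised sequence model
fitness_w = torch.randn(LENGTH, VOCAB)      # stand-in supervised fitness model

def log_score(x_onehot: torch.Tensor, temp: float = 1.0) -> torch.Tensor:
    """log p_evo(x) + f(x)/temp: the unnormalized product-of-experts target."""
    log_p_evo = (x_onehot * evo_logits.log_softmax(-1)).sum()
    fitness = (x_onehot * fitness_w).sum()
    return log_p_evo + fitness / temp

x = torch.nn.functional.one_hot(torch.randint(VOCAB, (LENGTH,)), VOCAB).float()
for _ in range(100):
    # Gradient-informed proposal: use d(score)/dx to bias which position and
    # token to mutate, then apply a Metropolis accept/reject step.
    x.requires_grad_(True)
    score = log_score(x)
    grad = torch.autograd.grad(score, x)[0]
    with torch.no_grad():
        delta = grad - (x * grad).sum(-1, keepdim=True)  # est. gain from a flip
        probs = (delta / 2).flatten().softmax(-1)
        idx = torch.multinomial(probs, 1).item()
        pos, tok = divmod(idx, VOCAB)
        x_new = x.detach().clone()
        x_new[pos] = 0.0
        x_new[pos, tok] = 1.0
        # Accept/reject (proposal ratio omitted for brevity in this sketch)
        if torch.rand(()) < (log_score(x_new) - score).exp():
            x = x_new
        else:
            x = x.detach()
```

The "plug & play" appeal is that the two experts are trained independently and only combined at sampling time, so either model can be swapped out without retraining the other.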

Efficient Iterative Amortized Inference for Learning Symmetric and Disentangled Multi-Object Representations

ICML, 2021
An efficient slot-based hierarchical VAE for learning unordered, symmetric, and disentangled scene representations.
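A toy sketch of the iterative amortized inference loop over slot posteriors, loosely in the spirit of the paper; the decoder and refinement network below are illustrative stand-ins, not the paper's architecture.

```python
# Minimal sketch: refine per-slot Gaussian posterior parameters with a shared
# refinement network driven by gradients of the ELBO.
import torch
import torch.nn as nn

N_SLOTS, SLOT_DIM, OBS_DIM = 3, 8, 32
decoder = nn.Linear(N_SLOTS * SLOT_DIM, OBS_DIM)          # toy scene decoder
refine = nn.Linear(4 * SLOT_DIM, 2 * SLOT_DIM)            # shared across slots

def elbo(mu, logvar, obs):
    z = mu + (0.5 * logvar).exp() * torch.randn_like(mu)  # reparameterized sample
    recon = decoder(z.reshape(1, -1))
    log_lik = -((recon - obs) ** 2).sum()
    kl = 0.5 * (mu ** 2 + logvar.exp() - logvar - 1).sum()
    return log_lik - kl

obs = torch.rand(1, OBS_DIM)
mu = torch.zeros(N_SLOTS, SLOT_DIM, requires_grad=True)   # shared init
logvar = torch.zeros(N_SLOTS, SLOT_DIM, requires_grad=True)
for _ in range(3):                                        # a few refinement steps
    loss = -elbo(mu, logvar, obs)
    g_mu, g_lv = torch.autograd.grad(loss, (mu, logvar))
    # Map (posterior params, their ELBO gradients) -> parameter updates,
    # applied identically to every slot, so the slots stay interchangeable.
    upd = refine(torch.cat([mu, logvar, g_mu, g_lv], dim=-1))
    d_mu, d_lv = upd.chunk(2, dim=-1)
    mu = (mu + d_mu).detach().requires_grad_(True)
    logvar = (logvar + d_lv).detach().requires_grad_(True)
```

Because the refinement network is shared and applied per slot, the inferred representation is unordered by construction, which is what gives the symmetry noted above.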

See my CV for a complete list including older publications.
