Research at Nova
Advancing the field of AI through groundbreaking research and open collaboration. Our team regularly publishes in top conferences and journals.
Sarah Chen, James Wilson, Michael Rodriguez • NeurIPS 2024
We present a novel approach to training large language models that reduces computational requirements by 60% while maintaining model performance. Our method is particularly suited for enterprise applications where compute and data efficiency are crucial.
David Kumar, Lisa Park • ICML 2024
A comprehensive framework for implementing secure multi-party computation in distributed AI systems, enabling multiple parties to jointly compute AI models while keeping their input data private.
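To illustrate the core idea behind secure multi-party computation, the sketch below shows additive secret sharing, one standard building block: each party holds only a random share of an input, yet the parties can jointly compute a sum and reveal only the result. This is a generic, minimal example for intuition, not the framework described in the paper; the field modulus and share counts are illustrative choices.

```python
import random

PRIME = 2**61 - 1  # illustrative field modulus, not from the paper

def share(value, n_parties):
    """Split `value` into additive shares mod PRIME.

    Any n_parties - 1 shares look uniformly random; all
    shares are needed to recover the value.
    """
    shares = [random.randrange(PRIME) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

def reconstruct(shares):
    """Recombine shares to recover the secret."""
    return sum(shares) % PRIME

# Two private inputs, each split across three parties.
a_shares = share(42, 3)
b_shares = share(100, 3)

# Each party adds its shares locally; only the final
# reconstructed sum is ever revealed, not the inputs.
sum_shares = [(a + b) % PRIME for a, b in zip(a_shares, b_shares)]
result = reconstruct(sum_shares)  # 142
```

Because addition of shares commutes with reconstruction, the parties learn the joint sum without exposing their individual inputs.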
Emily Chang, Sarah Chen • ICLR 2024
An innovative approach to neural architecture search that specifically targets resource-constrained enterprise environments, achieving state-of-the-art performance with minimal computational overhead.
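The essence of resource-constrained architecture search can be sketched as scoring candidate architectures under an explicit parameter budget. The toy example below is a hypothetical illustration only: the search space (depth, width), the parameter-count formula, and the proxy score are all assumptions for exposition, not the paper's method.

```python
import itertools

def param_count(depth, width, in_dim=32, out_dim=10):
    """Rough parameter count of a dense network:
    input layer, (depth - 1) hidden layers, output layer."""
    return in_dim * width + (depth - 1) * width * width + width * out_dim

def proxy_score(depth, width):
    """Toy stand-in for validation accuracy: prefers larger nets."""
    return depth * width

def constrained_search(budget):
    """Enumerate small (depth, width) configs, keep the best one
    whose parameter count fits the resource budget."""
    best = None
    for depth, width in itertools.product(range(1, 5), [16, 32, 64, 128]):
        if param_count(depth, width) > budget:
            continue  # exceeds the resource constraint, skip
        candidate = (proxy_score(depth, width), depth, width)
        if best is None or candidate > best:
            best = candidate
    return best  # (score, depth, width) or None if budget is too small
```

Real systems replace the exhaustive loop and proxy score with learned or gradient-based estimators, but the budget check is what makes the search "resource-constrained."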
Collaborate with Us
Interested in collaborating on research? We're always open to partnerships with academic institutions and industry researchers.
Contact Research Team