Research at Nova

Advancing the field of AI through groundbreaking research and open collaboration. Our team regularly publishes in top conferences and journals.

Efficient Training of Large Language Models for Enterprise Applications

Sarah Chen, James Wilson, Michael Rodriguez · NeurIPS 2024

We present a novel approach to training large language models that reduces computational requirements by 60% while maintaining model performance. Our method is particularly suited for enterprise applications where data efficiency is crucial.

Keywords: Large Language Models, Efficient Training, Enterprise AI

Secure Multi-Party Computation for Distributed AI Systems

David Kumar, Lisa Park · ICML 2024

A comprehensive framework for implementing secure multi-party computation in distributed AI systems, enabling multiple parties to jointly train and evaluate AI models while keeping their input data private.

Keywords: Security, Distributed Systems, Privacy

Neural Architecture Search for Resource-Constrained Environments

Emily Chang, Sarah Chen · ICLR 2024

An approach to neural architecture search that targets resource-constrained enterprise environments, achieving state-of-the-art performance with minimal computational overhead.

Keywords: Neural Architecture Search, Efficiency, Enterprise

Collaborate with Us

Interested in collaborating on research? We are always open to partnerships with academic institutions and industry researchers.

Contact Research Team