
Showing 1–14 of 14 results for author: Fung, V

Searching in archive cond-mat.
  1. arXiv:2603.16631  [pdf]

    cond-mat.mtrl-sci

    Ligand-Controlled Phonon Dynamics in CsPbBr3 Nanocrystals Revealed by Machine-Learned Interatomic Potentials

    Authors: Seungjun Cha, Chen Wang, Victor Fung, Guoxiang Hu

    Abstract: Halide perovskite nanocrystals are leading candidates for next-generation optoelectronics, yet the role of surface ligands in controlling their phonon dynamics remains poorly understood. These dynamics critically govern nonradiative relaxation, energy up-conversion, and phonon-assisted anti-Stokes emission. Conventional ab initio methods, while accurate, are computationally infeasible for experime…

    Submitted 17 March, 2026; originally announced March 2026.

  2. arXiv:2602.21533  [pdf]

    cond-mat.mtrl-sci cs.LG

    Reasoning-Driven Design of Single Atom Catalysts via a Multi-Agent Large Language Model Framework

    Authors: Dong Hyeon Mok, Seoin Back, Victor Fung, Guoxiang Hu

    Abstract: Large language models (LLMs) are becoming increasingly applied beyond natural language processing, demonstrating strong capabilities in complex scientific tasks that traditionally require human expertise. This progress has extended into materials discovery, where LLMs introduce a new paradigm by leveraging reasoning and in-context learning, capabilities absent from conventional machine learning ap…

    Submitted 24 February, 2026; originally announced February 2026.

  3. arXiv:2602.20959  [pdf, ps, other]

    cond-mat.mtrl-sci

    Determining Atomic Structure from Spectroscopy via an Active Learning Framework

    Authors: Ian Slagle, Faisal Alamgir, Victor Fung

    Abstract: Determining atomic structure from spectroscopic data is central to materials science but remains restricted to a limited set of techniques and material classes, largely due to the computational cost and complexity of structural refinement. Here we introduce ActiveStructOpt, a general framework that integrates graph neural network surrogate models with active learning to efficiently determine candi…

    Submitted 24 February, 2026; originally announced February 2026.

  4. arXiv:2602.19592  [pdf, ps, other]

    cond-mat.mtrl-sci

    Improving Reliability of Machine Learned Interatomic Potentials With Physics-Informed Pretraining

    Authors: Qianyu Zheng, Victor Fung

    Abstract: Machine learned interatomic potentials (MLIPs) have emerged as powerful tools for molecular dynamics (MD) simulations with their competitive accuracy and computational efficiency. However, MLIPs are often observed to exhibit unphysical behavior when encountering configurations which deviate significantly from their training data distribution, leading to simulation instabilities and unreliable dyn…

    Submitted 23 February, 2026; originally announced February 2026.

  5. arXiv:2509.21694  [pdf, ps, other]

    cond-mat.mtrl-sci

    Scalable Foundation Interatomic Potentials via Message-Passing Pruning and Graph Partitioning

    Authors: Lingyu Kong, Jaeheon Shim, Guoxiang Hu, Victor Fung

    Abstract: Atomistic foundation models (AFMs) have great promise as accurate interatomic potentials, and have enabled data-efficient molecular dynamics simulations with near quantum mechanical accuracy. However, AFMs remain markedly slower at inference and are far more memory-intensive than conventional interatomic potentials, due to the need to capture a wide range of chemical and structural motifs in pre-t…

    Submitted 25 September, 2025; originally announced September 2025.

  6. arXiv:2509.08418  [pdf, ps, other]

    cond-mat.mtrl-sci cs.LG

    Facet: highly efficient E(3)-equivariant networks for interatomic potentials

    Authors: Nicholas Miklaucic, Lai Wei, Rongzhi Dong, Nihang Fu, Sadman Sadeed Omee, Qingyang Li, Sourin Dey, Victor Fung, Jianjun Hu

    Abstract: Computational materials discovery is limited by the high cost of first-principles calculations. Machine learning (ML) potentials that predict energies from crystal structures are promising, but existing methods face computational bottlenecks. Steerable graph neural networks (GNNs) encode geometry with spherical harmonics, respecting atomic symmetries -- permutation, rotation, and translation -- fo…

    Submitted 10 September, 2025; originally announced September 2025.

  7. arXiv:2509.03401  [pdf]

    cond-mat.mtrl-sci

    A Comprehensive Assessment and Benchmark Study of Large Atomistic Foundation Models for Phonons

    Authors: Md Zaibul Anam, Ogheneyoma Aghoghovbia, Mohammed Al-Fahdi, Lingyu Kong, Victor Fung, Ming Hu

    Abstract: The rapid development of universal machine learning potentials (uMLPs) has enabled efficient, accurate predictions of diverse material properties across broad chemical spaces. While their capability for modeling phonon properties is emerging, systematic benchmarking across chemically diverse systems remains limited. We evaluate six recent uMLPs (EquiformerV2, MatterSim, MACE, and CHGNet) on 2,429…

    Submitted 3 September, 2025; originally announced September 2025.

    Journal ref: Advanced Intelligent Discovery 2025, 0, e202500075

  8. arXiv:2504.10655  [pdf]

    cond-mat.mtrl-sci cs.AI cs.LG

    MatterTune: An Integrated, User-Friendly Platform for Fine-Tuning Atomistic Foundation Models to Accelerate Materials Simulation and Discovery

    Authors: Lingyu Kong, Nima Shoghi, Guoxiang Hu, Pan Li, Victor Fung

    Abstract: Geometric machine learning models such as graph neural networks have achieved remarkable success in recent years in chemical and materials science research for applications such as high-throughput virtual screening and atomistic simulations. The success of these models can be attributed to their ability to effectively learn latent representations of atomic structures directly from the training dat…

    Submitted 14 April, 2025; originally announced April 2025.

  9. arXiv:2504.06249  [pdf, other]

    cond-mat.mtrl-sci

    Electronic Structure Guided Inverse Design Using Generative Models

    Authors: Shuyi Jia, Panchapakesan Ganesh, Victor Fung

    Abstract: The electronic structure of a material fundamentally determines its underlying physical, and by extension, its functional properties. Consequently, the ability to identify or generate materials with desired electronic properties would enable the design of tailored functional materials. Traditional approaches relying on human intuition or exhaustive computational screening of known materials remain…

    Submitted 8 April, 2025; originally announced April 2025.

  10. arXiv:2503.01227  [pdf, other]

    cond-mat.mtrl-sci cs.LG

    Pre-training Graph Neural Networks with Structural Fingerprints for Materials Discovery

    Authors: Shuyi Jia, Shitij Govil, Manav Ramprasad, Victor Fung

    Abstract: In recent years, pre-trained graph neural networks (GNNs) have been developed as general models which can be effectively fine-tuned for various potential downstream tasks in materials science, and have shown significant improvements in accuracy and data efficiency. The most widely used pre-training methods currently involve either supervised training to fit a general force field or self-supervised…

    Submitted 3 March, 2025; originally announced March 2025.

  11. arXiv:2408.07213  [pdf, other]

    cond-mat.mtrl-sci

    Representation-space diffusion models for generating periodic materials

    Authors: Anshuman Sinha, Shuyi Jia, Victor Fung

    Abstract: Generative models hold the promise of significantly expediting the materials design process when compared to traditional human-guided or rule-based methodologies. However, effectively generating high-quality periodic structures of materials on limited but diverse datasets remains an ongoing challenge. Here we propose a novel approach for periodic structure generation which fully respects the intrin…

    Submitted 13 August, 2024; originally announced August 2024.

  12. arXiv:2406.13163  [pdf, other]

    cond-mat.mtrl-sci cs.AI cs.CL

    LLMatDesign: Autonomous Materials Discovery with Large Language Models

    Authors: Shuyi Jia, Chao Zhang, Victor Fung

    Abstract: Discovering new materials can have significant scientific and technological implications but remains a challenging problem today due to the enormity of the chemical space. Recent advances in machine learning have enabled data-driven methods to rapidly screen or generate promising materials, but these methods still depend heavily on very large quantities of training data and often lack the flexibil…

    Submitted 18 June, 2024; originally announced June 2024.

  13. arXiv:2207.13227  [pdf]

    cond-mat.mtrl-sci cs.LG

    Atomic structure generation from reconstructing structural fingerprints

    Authors: Victor Fung, Shuyi Jia, Jiaxin Zhang, Sirui Bi, Junqi Yin, P. Ganesh

    Abstract: Data-driven machine learning methods have the potential to dramatically accelerate the rate of materials design over conventional human-guided approaches. These methods would help identify or, in the case of generative models, even create novel crystal structures of materials with a set of specified functional properties to then be synthesized or isolated in the laboratory. For crystal structure g…

    Submitted 26 July, 2022; originally announced July 2022.

    Comments: 16 pages and 9 figures in the main text

  14. arXiv:2106.03013  [pdf]

    cond-mat.mtrl-sci cs.LG

    Inverse design of two-dimensional materials with invertible neural networks

    Authors: Victor Fung, Jiaxin Zhang, Guoxiang Hu, P. Ganesh, Bobby G. Sumpter

    Abstract: The ability to readily design novel materials with chosen functional properties on-demand represents a next frontier in materials discovery. However, thoroughly and efficiently sampling the entire design space in a computationally tractable manner remains a highly challenging task. To tackle this problem, we propose an inverse design framework (MatDesINNe) utilizing invertible neural networks whic…

    Submitted 5 June, 2021; originally announced June 2021.