Conventional battery research often grapples with knowledge scattered across countless academic resources, including papers, lectures, and conference proceedings. This dispersed information spans multiple modalities, which limits the efficiency of literature review and methodology comparison and slows the advancement of energy storage technologies. In this perspective, we review recent progress and introduce a paradigm shift catalyzed by large language models, which harness natural language processing and artificial intelligence to enable rapid information synthesis and insight extraction. To illustrate this transformative potential, we present a practical demonstration: a comprehensive literature review on fast charging powered by ChatGPT, showcasing how the proposed paradigm can streamline the synthesis of information from a vast array of sources. This perspective underscores the profound impact of large language models on battery research, pointing toward an era of greater efficiency and accelerated innovation in energy storage technologies.