Authors: Mayank Mishra, Matt Stallone, Gaoyuan Zhang, Yikang Shen, Aditya Prasad, Adriana Meza Soria, Michele Merler, Parameswaran Selvam, Saptha Surendran, Shivdeep Singh, Manish Sethi, Xuan-Hong Dang, Pengyuan Li, Kun-Lung Wu, Syed Zawad, Andrew Coleman, Matthew White, Mark Lewis, Raju Pavuluri, Yan Koyfman, Boris Lublinsky, Maximilien de Bayser, Ibrahim Abdelaziz, Kinjal Basu, Mayank Agarwal, Yi Zhou, Chris Johnson, Aanchal Goyal, Hima Patel, Yousaf Shah, Petros Zerfos, Heiko Ludwig, Asim Munawar, Maxwell Crouse, Pavan Kapanipathi, Shweta Salaria, Bob Calio, Sophia Wen, Seetharami Seelam, Brian Belgodere, Carlos Fonseca, Amith Singhee, Nirmit Desai, David D. Cox, Ruchir Puri, Rameswar Panda
Published on: May 07, 2024
Impact Score: 7.8
arXiv: 2405.04324
Summary
- What is new: Introduction of the Granite series of decoder-only code models, trained on code written in 116 programming languages and ranging from 3 to 34 billion parameters.
- Why this is important: Software development needs more versatile and capable Large Language Models (LLMs) to aid in tasks such as code generation, bug fixing, and code explanation.
- What the research proposes: The Granite Code models, a family of decoder-only LLMs optimized for a wide range of software development tasks and demonstrating state-of-the-art performance on code generation tasks.
- Results: Granite Code models reach state-of-the-art performance across a comprehensive set of coding tasks and are released under the Apache 2.0 license for broad accessibility (a usage sketch follows the list below).
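As a concrete illustration, the snippet below shows how one of the released checkpoints could be used for code completion with the Hugging Face transformers library. The model id "ibm-granite/granite-3b-code-base" and the generation settings are assumptions made for this sketch, not details taken from the paper.

```python
# Minimal sketch of code completion with a Granite Code base model.
# The checkpoint name below is an assumption about where the Apache 2.0
# weights are hosted; substitute the actual model id if it differs.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ibm-granite/granite-3b-code-base"  # assumed checkpoint name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision to fit on a single GPU
    device_map="auto",
)

prompt = 'def fibonacci(n):\n    """Return the n-th Fibonacci number."""\n'
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Greedy decoding of the continuation; a base model completes code
# rather than following natural-language instructions.
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```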
Technical Details
Technological frameworks used: Decoder-only LLM architecture (illustrated in the sketch after this list)
Models used: Granite Code model family, ranging from 3 to 34 billion parameters
Data used: Code written in 116 programming languages
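To make the "decoder-only" framing concrete, the sketch below spells out the autoregressive loop such a model runs at inference time: each new token is predicted from all previously generated tokens. The greedy_complete helper is illustrative only, not code from the paper, and it works with any causal language model loaded as in the earlier snippet.

```python
import torch


@torch.no_grad()
def greedy_complete(model, tokenizer, prompt, max_new_tokens=32):
    """Illustrative greedy decoding loop for a decoder-only (causal) LM."""
    ids = tokenizer(prompt, return_tensors="pt").input_ids.to(model.device)
    for _ in range(max_new_tokens):
        logits = model(ids).logits  # shape: [batch, seq_len, vocab_size]
        # Pick the highest-scoring next token given the tokens so far.
        next_id = logits[:, -1, :].argmax(dim=-1, keepdim=True)
        ids = torch.cat([ids, next_id], dim=-1)
        # Stop early if the model emits its end-of-sequence token.
        if tokenizer.eos_token_id is not None and next_id.item() == tokenizer.eos_token_id:
            break
    return tokenizer.decode(ids[0], skip_special_tokens=True)
```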
Potential Impact
Software development environments, enterprise software development workflows, and companies offering developer tools or platforms stand to benefit substantially or face disruption.
Want to implement this idea in a business?
We have generated a startup concept here: CodeCrafter AI.