Authors: Matthew DeLorenzo, Animesh Basak Chowdhury, Vasudev Gohil, Shailja Thakur, Ramesh Karri, Siddharth Garg, Jeyavijayan Rajendran
Published on: February 05, 2024
Impact Score: 8.07
arXiv: 2402.03289
Summary
- What is new: An automated transformer decoding algorithm that uses Monte Carlo tree search for lookahead to produce compilable, functionally correct, and PPA-optimized register transfer level (RTL) code.
- Why this is important: Existing large language models struggle to generate RTL code that compiles cleanly and is optimized for power, performance, and area (PPA), because they have no awareness of PPA metrics during decoding.
- What the research proposes: A new decoding technique that integrates Monte Carlo tree search with transformer decoding to guide the language model in generating better RTL code.
- Results: The technique significantly outperforms existing methods, achieving a 31.8% improvement in area-delay product for a 16-bit adder, demonstrating its ability to generate functionally correct code that is also PPA efficient.
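The area-delay product (ADP) mentioned in the results is a standard PPA figure of merit: synthesized area multiplied by critical-path delay. The sketch below illustrates the arithmetic behind a 31.8% ADP reduction; the baseline area and delay numbers are hypothetical placeholders, not values from the paper.

```python
# Area-delay product (ADP): a common PPA figure of merit for synthesized
# circuits, computed as area times critical-path delay.
def adp(area_um2: float, delay_ns: float) -> float:
    return area_um2 * delay_ns

# Hypothetical baseline synthesis result for a 16-bit adder (illustrative only).
baseline = adp(120.0, 0.50)

# A 31.8% ADP improvement, as reported in the summary, means the new design's
# ADP is 68.2% of the baseline's.
improved = baseline * (1 - 0.318)
```

A lower ADP can come from smaller area, shorter delay, or both; the metric lets the search trade them off with a single scalar.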
Technical Details
Technological frameworks used: Transformer-based language models with Monte Carlo tree search for decoding.
Models used: Large language models fine-tuned on RTL code datasets.
Data used: RTL code datasets for empirical evaluation.
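To make the decoding scheme concrete, here is a minimal, self-contained sketch of Monte Carlo tree search over token sequences. It is an illustration of the general MCTS-guided decoding idea, not the paper's implementation: `toy_lm` is a hypothetical stand-in for the fine-tuned RTL language model, and `reward` stands in for the evaluator that would score compilability, functional correctness, and PPA of the generated RTL.

```python
import math
import random

def toy_lm(prefix):
    """Hypothetical stand-in for an LLM: returns (token, probability) pairs
    for the next token given a prefix. A real system would query the
    fine-tuned RTL model here."""
    return [("a", 0.6), ("b", 0.4)]

def reward(tokens):
    """Hypothetical stand-in for the RTL evaluator (compilation, functional
    correctness, PPA). Here it simply favors sequences with more 'b' tokens."""
    return tokens.count("b") / max(len(tokens), 1)

class Node:
    def __init__(self, prefix):
        self.prefix = prefix    # token sequence decoded so far
        self.children = {}      # token -> Node
        self.visits = 0
        self.value = 0.0        # cumulative reward from rollouts through here

def mcts_decode(max_len=4, iterations=200, c=1.4):
    root = Node([])
    for _ in range(iterations):
        node, path = root, [root]
        # Selection: descend via UCT while the node is fully expanded.
        while len(node.prefix) < max_len and len(node.children) == len(toy_lm(node.prefix)):
            parent = node
            node = max(parent.children.values(),
                       key=lambda ch: ch.value / (ch.visits + 1e-9)
                       + c * math.sqrt(math.log(parent.visits + 1) / (ch.visits + 1e-9)))
            path.append(node)
        # Expansion: add one unexpanded child, if not at max length.
        if len(node.prefix) < max_len:
            for tok, _ in toy_lm(node.prefix):
                if tok not in node.children:
                    node.children[tok] = Node(node.prefix + [tok])
                    node = node.children[tok]
                    path.append(node)
                    break
        # Rollout: complete the sequence by sampling from the model.
        rollout = list(node.prefix)
        while len(rollout) < max_len:
            toks, probs = zip(*toy_lm(rollout))
            rollout.append(random.choices(toks, weights=probs)[0])
        r = reward(rollout)
        # Backpropagation: credit the reward along the selected path.
        for n in path:
            n.visits += 1
            n.value += r
    # Extraction: follow the most-visited child at each level.
    seq, node = [], root
    while node.children:
        node = max(node.children.values(), key=lambda ch: ch.visits)
        seq.append(node.prefix[-1])
    return seq
```

The lookahead is what distinguishes this from greedy or beam decoding: rollouts let the search observe the eventual reward of a full sequence before committing to an early token, so tokens with lower model probability can still win if they lead to better-scoring RTL.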
Potential Impact
This research could disrupt the integrated circuit design market and electronic design automation (EDA) companies, and could affect companies that specialize in hardware acceleration or invest heavily in custom chip design.
Want to implement this idea in a business?
We have generated a startup concept here: OptiCodeAI.