Student Scholar Symposium Abstracts and Posters

Document Type

Chapman access only poster or presentation

Publication Date

Fall 12-4-2024

Faculty Advisor(s)

Emad Arasteh

Abstract

Transaction-level modeling (TLM) is a high-level abstraction that models hardware, software, and their detailed interactions, enabling performance analysis in the early stages of embedded system design. For efficient embedded system design, architects design, configure, and simulate TLM models to explore the design space and find the optimal candidate for implementation. Running a multitude of TLM simulations, however, is a time-consuming and inefficient process. The objective of our project is to design a machine-learning model that captures and learns the complexities of the TLM model of a deep neural network (DNN). Specifically, we focus on predicting simulated time for SystemC TLM-2.0 loosely-timed contention-aware (LT-CA) models, which account for the effect of memory and interconnect contention in system-level design and performance estimation. We design linear regression and neural network models that describe the effect of system-level configuration knobs, such as computational capacity and memory access, on the simulated time of the TLM-2.0 LT-CA models of GoogLeNet, a state-of-the-art DNN for image recognition. Our predictive model uses data produced by the TLM simulations for training and validation to learn patterns and relationships within the data. Our experimental results on the TLM-2.0 LT-CA models show the high accuracy of the proposed predictive model and deliver promising results for enhancing the overall efficiency of system-level modeling.
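The regression approach described above can be illustrated with a minimal sketch. The knob names, value ranges, and the synthetic relationship between knobs and simulated time are all assumptions for illustration; the actual project trains on outputs of the SystemC TLM-2.0 LT-CA simulations of GoogLeNet, which are not reproduced here.

```python
import numpy as np

# Hypothetical configuration knobs: computational capacity and a memory
# access latency. Synthetic data stands in for real TLM simulation runs.
rng = np.random.default_rng(0)
n = 200
capacity = rng.uniform(1.0, 16.0, n)      # assumed units: GOPS
mem_latency = rng.uniform(10.0, 100.0, n) # assumed units: ns

# Assumed ground truth: simulated time shrinks with capacity and grows
# with memory latency, plus noise (a stand-in for contention effects).
sim_time = 50.0 / capacity + 0.8 * mem_latency + rng.normal(0.0, 2.0, n)

# Ordinary least squares on simple engineered features.
X = np.column_stack([1.0 / capacity, mem_latency, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, sim_time, rcond=None)

# Goodness of fit on the training data.
pred = X @ coef
r2 = 1.0 - np.sum((sim_time - pred) ** 2) / np.sum((sim_time - sim_time.mean()) ** 2)
print(f"R^2: {r2:.3f}")
```

In practice each row of `X` would come from one TLM simulation run, and a trained model like this one replaces further slow simulations with a fast prediction of simulated time for unseen knob settings.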

Comments

Presented at the Fall 2024 Student Scholar Symposium at Chapman University.
