With the growing number of process variation sources in deeply nano-scaled technologies, parameterized device and circuit modeling is becoming very important for chip design and verification. However, the high dimensionality of the parameter space for process variation analysis poses a serious modeling challenge for emerging VLSI technologies. These parameters correspond to various inter-die and intra-die variations and considerably increase the difficulty of design validation. Today's response surface models and the most commonly used parameter reduction methods, such as Principal Component Analysis (PCA) and Independent Component Analysis (ICA), limit parameter reduction to linear or quadratic forms and do not address higher-order nonlinearities among process and performance parameters. In this paper, we propose and validate a feature selection method that reduces the circuit modeling complexity associated with high parameter dimensionality. The method relies on learning-based nonlinear sparse regression and performs parameter selection in the input space rather than creating a new space. It handles mixed Gaussian and non-Gaussian parameters and yields a more precise parameter selection by accounting for nonlinear statistical dependencies between input and output parameters. We demonstrate the method on digital circuit timing analysis in both FinFET and Silicon Nanowire technologies. The results confirm that the method significantly reduces the number of required simulations while keeping the estimation error small.
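As a rough illustration only, and not the authors' implementation, the sketch below shows how sparse regression can select process parameters directly in the input space rather than building a new space as PCA or ICA would. It stands in a polynomial feature expansion plus an L1-penalized fit (scikit-learn's Lasso) for the paper's learning-based nonlinear sparse regression; the data, parameter counts, and the delay-like response are synthetic placeholders.

```python
# Hedged sketch: parameter selection via nonlinear sparse regression.
# Synthetic stand-in data; not the paper's actual learning-based method.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures, StandardScaler
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(0)
n_samples, n_params = 500, 20                    # e.g., Monte Carlo runs x process parameters
X = rng.normal(size=(n_samples, n_params))       # placeholder variation sources
X[:, 3] = rng.lognormal(size=n_samples)          # one non-Gaussian parameter, for illustration
# Hypothetical performance metric (e.g., a path delay) with nonlinear dependence on a few parameters
y = 2.0 * X[:, 0] + 0.5 * X[:, 1] ** 2 + X[:, 3] * X[:, 5] + 0.1 * rng.normal(size=n_samples)

# Expand parameters with degree-2 polynomial terms to expose nonlinear and interaction effects,
# then fit a sparse (L1-penalized) regression so most expanded terms receive zero weight.
poly = PolynomialFeatures(degree=2, include_bias=False)
Z = StandardScaler().fit_transform(poly.fit_transform(X))
lasso = LassoCV(cv=5, random_state=0).fit(Z, y)

# Map each expanded term back to the original parameters it involves and accumulate
# coefficient magnitudes, so the selection happens over the original input parameters.
scores = np.zeros(n_params)
for coef, powers in zip(lasso.coef_, poly.powers_):
    for j in np.nonzero(powers)[0]:
        scores[j] += abs(coef)

selected = np.argsort(scores)[::-1][:5]          # keep the top-k parameters for further simulation
print("selected parameter indices:", selected)
```

Only the retained parameters would then be swept in subsequent simulations, which is where the reduction in simulation count described in the abstract comes from.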