arXiv:2105.04232 [cs.LG]

De-homogenization using Convolutional Neural Networks

Martin O. Elingaard, Niels Aage, J. Andreas Bærentzen, Ole Sigmund

Published 2021-05-10 (Version 1)

This paper presents a deep learning-based de-homogenization method for structural compliance minimization. By using a convolutional neural network to parameterize the mapping from a set of lamination parameters on a coarse mesh to a one-scale design on a fine mesh, we avoid solving the least-squares problems associated with traditional de-homogenization approaches and save the corresponding computation time. To train the neural network, a custom two-step loss function is developed which ensures a periodic output field that follows the local lamination orientations. A key feature of the proposed method is that the training is carried out without any use of, or reference to, the underlying structural optimization problem, which renders the method robust and insensitive with respect to domain size, boundary conditions, and loading. A post-processing procedure utilizing a distance transform on the skeleton of the output field projects the desired lamination widths onto the output field while ensuring a predefined minimum length scale and volume fraction. To demonstrate the excellent generalization properties of the deep learning approach, numerical examples are shown for several different load and boundary conditions. For an appropriate choice of parameters, the de-homogenized designs perform within $7-25\%$ of the homogenization-based solution at a fraction of the computational cost. With several options for further improvement, the scheme may provide the basis for future interactive high-resolution topology optimization.
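The coarse-to-fine mapping described above can be illustrated with a minimal sketch (not the authors' implementation): a small convolutional network upsamples a coarse orientation field to a fine scalar field, and a hypothetical gradient-alignment loss encourages the field to vary periodically along the local lamination orientation. The class name DehomogenizationCNN, the layer sizes, the alignment_loss term, and the wavelength parameter are all illustrative assumptions, not details taken from the paper.

# Minimal sketch of the de-homogenization idea (illustrative assumptions only).
import torch
import torch.nn as nn
import torch.nn.functional as F

class DehomogenizationCNN(nn.Module):
    def __init__(self, channels=32):
        super().__init__()
        # Input: 2 channels = (cos(2*theta), sin(2*theta)) of the coarse orientation field.
        # Output: 1 channel = fine-scale field whose iso-contours form the lamination pattern.
        self.net = nn.Sequential(
            nn.Conv2d(2, channels, 3, padding=1), nn.ReLU(),
            nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False),
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(),
            nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False),
            nn.Conv2d(channels, 1, 3, padding=1),
        )

    def forward(self, orientation):
        return self.net(orientation)

def alignment_loss(field, theta_fine, wavelength=8.0):
    """Hypothetical loss: penalize deviation of the field's spatial gradient
    from the desired lamination direction and wavenumber (illustrative only)."""
    # Finite-difference gradients of the predicted fine-scale field.
    gx = field[..., :, 1:] - field[..., :, :-1]
    gy = field[..., 1:, :] - field[..., :-1, :]
    gx = F.pad(gx, (0, 1, 0, 0))
    gy = F.pad(gy, (0, 0, 0, 1))
    k = 2 * torch.pi / wavelength
    target_gx = k * torch.cos(theta_fine)
    target_gy = k * torch.sin(theta_fine)
    return ((gx - target_gx) ** 2 + (gy - target_gy) ** 2).mean()

# Usage sketch: coarse 16x16 orientation field -> 64x64 fine field.
theta = torch.rand(1, 1, 16, 16) * torch.pi
coarse = torch.cat([torch.cos(2 * theta), torch.sin(2 * theta)], dim=1)
model = DehomogenizationCNN()
fine_field = model(coarse)  # shape (1, 1, 64, 64)
theta_fine = F.interpolate(theta, scale_factor=4, mode="bilinear", align_corners=False)
loss = alignment_loss(fine_field, theta_fine)
loss.backward()

In the method summarized by the abstract, the loss is evaluated in two steps and the network output is further post-processed with a skeleton-based distance transform to impose the lamination widths, minimum length scale, and volume fraction; the sketch only conveys the overall input/output structure of the network and the idea of an orientation-following, periodic output field.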

Related articles:
arXiv:1912.05687 [cs.LG] (Published 2019-12-11)
REFINED (REpresentation of Features as Images with NEighborhood Dependencies): A novel feature representation for Convolutional Neural Networks
arXiv:1603.03657 [cs.LG] (Published 2016-03-11)
Efficient forward propagation of time-sequences in convolutional neural networks using Deep Shifting
arXiv:1806.02012 [cs.LG] (Published 2018-06-06)
A Peek Into the Hidden Layers of a Convolutional Neural Network Through a Factorization Lens