Creating and automating workflows with all the components needed for parametric design and standardized system evaluation has long been a bottleneck in the product development process.
Ansys optiSLang offers GUI-supported binary interfacing with major CAE software tools used in virtual product development – others can be connected via scripting, text-based interfaces or custom integrations.
The software delivers access to the following:
- CAD (CATIA, Siemens NX, PTC Creo®, SOLIDWORKS, etc.)
- CAE (Ansys solvers, Abaqus, Simcenter Amesim™, etc.)
- Scripting (MathWorks® MATLAB®, Python™, etc.)
- Desktop tools (Microsoft® Excel, etc.)
- Repositories/databases (Ansys Minerva, etc.)
- In-house solvers
Ansys optiSLang also supports various high-performance computing (HPC) strategies. You can submit jobs from Microsoft® Windows® to Linux (or vice versa), or use your own queueing system, Ansys HPC or Ansys Cloud submission capabilities.
Different (parametric) environments can be connected and combined into one automated parametric workflow for simulation-driven product development.
Ansys optiSLang offers integration with Ansys Workbench, where you can easily access optiSLang modules for design evaluation with drag-and-drop functionality. With minimal effort, sensitivity studies, optimization or robustness evaluation can be set up. The same is true for Ansys Electronics Desktop (AEDT), where a wizard supports setup of an optiSLang workflow with only a few mouse clicks.
No matter which environment you use with optiSLang, you’ll find the same settings and dialogs: you and your colleagues need only learn one tool. Collaborative work and exchange of know-how are built in.
Simulation Workflows and Process Automation
Process automation and integration, as well as access to the best possible parametric simulation models, are the keys to successful CAE-based parametric studies. Ansys optiSLang has an intuitive graphical user interface that lets you connect computer-aided design tools to capture both simulation process automation and workflow generation, and provides wizard-driven modules to run sensitivity studies or robust design optimization.
The graphical user interface supports the workflow approach visually: single building blocks and algorithms are coupled graphically to show parameter flow, dependencies and scheduling. The relationships can be determined and controlled in one context, with easy-to-understand charts and control panels displayed at the same time. This enables full access to, and traceability of, the complete workflow. The user can connect any complex simulation process of CAE solvers and pre- and post-processors in heterogeneous networks or clusters. The simulations are automated either in a single solver process chain or in very complex multidisciplinary/multidomain flows. Even performance maps and their appraisal can be part of standardized workflows.
Once a simulation process and/or workflow is set up, the best-practice knowledge is captured and can be shared.
Ansys optiSLang provides Python, web, text and command line interfaces to enable:
- Automatic workflow creation
- Monitoring and remote control
Whether projects originate within optiSLang or from external sources, their use within custom applications is secured. Ansys optiSLang projects can be integrated into customized platforms, and repetitive, pervasive tasks can be standardized and automated.
Simulation workflows can be democratized through optiSLang. The power of simulation can now be used by those unfamiliar with simulation.
Design and Data Exploration
Do you agree with the statement: “If you don’t understand how input affects response, you can’t optimize it”?
Simulation plays a big role in understanding products and processes. Parameter-based variation analysis adds a layer of insight into how your design or process behaves under variability. Connecting parametric modeling to the powerful techniques of design variation and design exploration is a big step in engineering that helps you better understand and optimize your product in less time.
The optiSLang wizard guides you in defining design variables by boundary conditions or by possible discrete values. In multidisciplinary optimization tasks, the number of design variables can often be very large. With the help of powerful design of experiment (DOE) algorithms and correlation analysis, optiSLang’s sensitivity module automatically identifies those variables that effectively contribute to response variability. Based on this identification, the number of design variables will be decisively reduced and an efficient design exploration and optimization can be conducted. Additionally, sensitivity analysis creates the basis to appropriately formulate the optimization task, with respect to the choice and number of objectives, their weighting or possible constraints.
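The screening idea described above can be sketched in a few lines of plain Python. This is an illustrative toy, not optiSLang code: the model function, parameter names and sample count are assumptions, and a simple random design stands in for optiSLang's DOE algorithms.

```python
# Sketch of correlation-based variable screening: sample the design space,
# evaluate a (toy) model, and rank inputs by correlation with the response.
import random
import statistics

def model(x1, x2, x3):
    # Toy response: x1 dominates, x2 is weak, x3 has no influence.
    return 10.0 * x1 + 0.5 * x2 + 0.0 * x3

def pearson(xs, ys):
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

random.seed(0)
# Simple random (Monte Carlo) DOE over the design space [0, 1]^3.
doe = [[random.random() for _ in range(3)] for _ in range(200)]
responses = [model(*point) for point in doe]

# Screen inputs by linear correlation with the response.
for i, name in enumerate(["x1", "x2", "x3"]):
    r = pearson([p[i] for p in doe], responses)
    print(f"{name}: r = {r:+.2f}")
```

In this toy, x3 would be dropped from the subsequent optimization because its correlation is statistically indistinguishable from zero.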
Ansys optiSLang provides an automated workflow for sensitivity analysis to create the best possible correlation analysis. Based on the available data or simulation points, the workflow automatically reduces the number of important input variables and generates the best possible approximation model (metamodel) of how input variability affects response variability. The criterion for the optimal metamodel is its ability to forecast response variability for a given set of data points; the winner of this competition is known as the metamodel of optimal prognosis (MOP). The MOP workflow helps you use your data and parametric simulation studies efficiently and improve your design as much as possible. Adding leading-edge artificial intelligence (AI)-based machine learning algorithms to the MOP competition opens up even higher dimensions of parameters and data points, which are of particular interest for digital twin or autonomous driving applications. Read more about it in Reduced-Order Modeling.
Understanding your design or your product data becomes possible thanks to the use of intuitive and interactive post-processing and visualization tools. Knowing how performance is related to design inputs and operating parameters is key to simulation-driven product development and is a strong foundation for innovation and a competitive advantage.
The statistics on structure (SoS) module of optiSLang provides a variety of powerful statistical functions for field quantities, such as geometry deviation or stresses, on 1D, 2D or 3D discretization levels. These can be further explored, including:
- Encapsulating (lower and upper) value limits
- Mean, standard deviation, variance
- Linear correlations and coefficient of determination (CoD) with respect to input parameters
- Nonlinear sensitivity indices with respect to input parameters
- Quantile values, k*sigma values and exceedance probabilities for fixed limit values
- Cp, Cpk statistics
- Standard error for mean value and variance
- Eigenvalue decomposition of variability of field data using random field decomposition
Different visualization modes and various configuration options round off post-processing. You can choose between different representations. The interior of 3D structures can also be made visible with semitransparent isosurfaces or through sectional planes. SoS data evaluation and processing is scriptable and can be included in optiSLang workflows for post-processing purposes, or to automatically generate design variations based on random field parametrization in any CAE solver.
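Several of the per-node statistics listed above are easy to illustrate with the Python standard library. The stress samples, limit value and specification bounds below are invented for the sketch; they are not SoS output.

```python
# Minimal sketch of SoS-style statistics at one node: mean, standard
# deviation, quantile, exceedance probability and Cp/Cpk capability indices.
import random
import statistics

random.seed(1)
# 500 design variations of a stress value at one node (MPa, synthetic).
samples = [random.gauss(200.0, 10.0) for _ in range(500)]

mean = statistics.fmean(samples)
std = statistics.stdev(samples)

# Quantile and exceedance probability for a fixed limit value.
q99 = statistics.quantiles(samples, n=100)[98]   # ~99% quantile
limit = 230.0
p_exceed = sum(s > limit for s in samples) / len(samples)

# Process capability against a two-sided specification [LSL, USL].
lsl, usl = 160.0, 240.0
cp = (usl - lsl) / (6 * std)
cpk = min(usl - mean, mean - lsl) / (3 * std)

print(f"mean={mean:.1f}  std={std:.1f}  q99={q99:.1f}")
print(f"P(stress > {limit}) = {p_exceed:.3f}  Cp={cp:.2f}  Cpk={cpk:.2f}")
```

SoS evaluates these quantities at every node of the discretization, which is what turns them into field statistics rather than scalars.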
Ansys optiSLang builds metamodels based on simulation or test results. Essentially, a metamodel is a surrogate that learns from an underlying set of high-fidelity simulations or real-system data how response variability is connected to input variability, but delivers rapid feedback. Running a simulation to predict a certain design configuration may take hours or days, but running the reduced-order model provides the answer in a fraction of a second. These metamodels can be used inside optiSLang for optimization or robust design analysis, or exported for use as a ROM.
Ansys optiSLang metamodeling performs three important tasks. First, it determines the relevant parameter subspace, potentially reducing the dimensionality of the problem and the number of design points required to build sufficiently accurate metamodels. Second, it develops the metamodel of optimal prognosis (MOP) by identifying the optimal metamodel in that subspace, resulting in the best possible forecast quality of response variability. Third, it quantifies the forecast quality on a set of data points using the coefficient of prognosis (CoP) with cross-validation. Prediction accuracy is critical to the value of ROMs, and the rigorous cross-validation does not allow overfitting of data.
In summary, the MOP provides:
- Identification of the most important input variables related to each response value
- Automated identification of the metamodel with the best prognosis quality
- Quantification of objective forecast quality of each response
- Minimization of real solver runs by using MOP as a surrogate
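The cross-validated "model competition" behind the CoP can be sketched as follows. This is a heavily simplified illustration, not the MOP algorithm: only two toy candidate models compete, the data are invented, and leave-one-out stands in for the general cross-validation scheme.

```python
# Sketch of CoP-style model selection: score each candidate metamodel by
# leave-one-out prediction error and keep the one with the highest score.
import statistics

xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0]
ys = [0.1, 2.2, 3.9, 6.1, 8.0, 9.9, 12.2, 13.8]   # roughly y = 2x

def fit_mean(x_train, y_train):
    m = statistics.fmean(y_train)
    return lambda x: m

def fit_linear(x_train, y_train):
    mx, my = statistics.fmean(x_train), statistics.fmean(y_train)
    b = sum((x - mx) * (y - my) for x, y in zip(x_train, y_train)) / \
        sum((x - mx) ** 2 for x in x_train)
    a = my - b * mx
    return lambda x: a + b * x

def cop(fit):
    """Leave-one-out coefficient of prognosis: 1 - SSE_pred / SST."""
    sse = 0.0
    for i in range(len(xs)):
        x_tr, y_tr = xs[:i] + xs[i+1:], ys[:i] + ys[i+1:]
        sse += (fit(x_tr, y_tr)(xs[i]) - ys[i]) ** 2
    my = statistics.fmean(ys)
    sst = sum((y - my) ** 2 for y in ys)
    return 1.0 - sse / sst

candidates = {"constant": fit_mean, "linear": fit_linear}
scores = {name: cop(f) for name, f in candidates.items()}
best = max(scores, key=scores.get)
print(scores, "->", best)
```

Because every prediction is made on a point the model was not trained on, an overfitted candidate cannot win this competition: its held-out errors inflate the SSE term and drive its score down.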
The surrogates can also be used as ROMs. ROMs built on data from simulation or the field are considered data-based ROMs. In contrast to ROMs generated with linearization strategies, data-based ROMs have no limitations in representing nonlinearities. Of course, if not enough data points are available to represent the nonlinearity, the ROM’s forecast quality for the response variability will be low; but with an objective forecast quality measure (CoP) in place, users are not at risk of continuing with overfitted or unreliable data-based ROMs. On the other hand, the forecast quality, measured with global and local CoP values, is tied to the variation space represented in the underlying data set. If that variation space changes, or if the metamodel is used outside the variation range sufficiently represented by the underlying data, we recommend resampling the variation space and regenerating the ROM.
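The extrapolation caveat above amounts to a simple guard that can precede every ROM evaluation. The parameter names, training points and query values below are illustrative assumptions, not an optiSLang API.

```python
# Sketch: before evaluating a data-based ROM, check whether the query point
# lies inside the variation range covered by the training data.

training_points = [
    {"pressure": 1.0, "temp": 300.0},
    {"pressure": 2.5, "temp": 350.0},
    {"pressure": 4.0, "temp": 420.0},
]

# Per-parameter bounds of the sampled variation space.
bounds = {
    k: (min(p[k] for p in training_points), max(p[k] for p in training_points))
    for k in training_points[0]
}

def extrapolating_params(query):
    """Return the parameters for which the ROM would extrapolate."""
    return [k for k, (lo, hi) in bounds.items()
            if not (lo <= query[k] <= hi)]

print(extrapolating_params({"pressure": 3.0, "temp": 400.0}))  # []
print(extrapolating_params({"pressure": 5.0, "temp": 280.0}))  # both outside
```

If the guard reports extrapolating parameters, the recommendation above applies: resample the variation space and regenerate the ROM rather than trusting the prediction.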
ROMs are of great importance for system simulation and for digital twins. If detailed high-fidelity product simulation must be linked to sensor data for an accurate prediction of characteristic values (e.g., the service life of turbine blades), data-based ROMs are the key technology to bridge high-fidelity simulation and real-time evaluation and to optimize the maintenance or operation of a system. The response-time requirements of digital twins can only be met if the high-fidelity simulation models are accelerated by orders of magnitude.
The random field decomposition is the key technology to extend metamodeling from scalar values to time series (signal MOP) and field data (field MOP). It opens new horizons of metamodeling to be used as ROMs in virtual prototyping, in system simulation and for digital twin applications.
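The decomposition idea can be illustrated with a small eigenvalue analysis of synthetic field data, assuming NumPy is available. The snapshots, mode shapes and noise level are fabricated for the sketch; real random field decomposition in SoS works on simulation or measurement fields.

```python
# Sketch of random field decomposition: the scatter of a field quantity
# across many design variations is expanded into a few dominant eigenmodes,
# whose amplitudes become scalar quantities suitable for metamodeling.
import numpy as np

rng = np.random.default_rng(5)
n_designs, n_nodes = 100, 50
x = np.linspace(0.0, 1.0, n_nodes)

# Synthetic field snapshots: two smooth modes with random amplitudes + noise.
snapshots = (
    rng.normal(0, 1.0, (n_designs, 1)) * np.sin(np.pi * x)
    + rng.normal(0, 0.3, (n_designs, 1)) * np.sin(2 * np.pi * x)
    + rng.normal(0, 0.01, (n_designs, n_nodes))
)

mean_field = snapshots.mean(axis=0)
centered = snapshots - mean_field
# Eigenvalue decomposition of the field covariance matrix.
cov = centered.T @ centered / (n_designs - 1)
eigvals, eigvecs = np.linalg.eigh(cov)
eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]   # descending order

explained = eigvals / eigvals.sum()
print(f"variance explained by 2 modes: {explained[:2].sum():.3f}")
```

Because almost all field variability is captured by a handful of mode amplitudes, a signal or field MOP only has to learn a few scalar outputs instead of one output per node.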
Design Optimization and Parameter Identification (Calibration)
Ansys optiSLang provides powerful optimization algorithms and automated workflows for the efficient determination of optimal design parameters across various multidisciplinary, nonlinear and multicriteria optimization tasks. Starting from the earlier steps of sensitivity analysis, optiSLang is already aware of the most relevant parameters and can perform a pre-optimization on the metamodel to target optimal design configurations. The wizard-driven optimization module provides user guidance (a decision tree) to recommend an optimizer with default settings.
Available algorithms include:
- Gradient-based methods (NLPQL)
- Nature-inspired optimization algorithms (NOA)
- Adaptive response surface method (ARSM)
- Adaptive metamodel of optimal prognosis (AMOP)
- Customized optimization algorithms
Efficiently running a pre-optimization of parameter sets and studying possible objective conflicts using the metamodels from sensitivity analysis helps you arrive at a good initial design and guides you toward a valid formulation of your optimization task. Easy definition of parameter ranges, single and multiple objectives and constraints helps you set up all studies. For even more efficient and standardized work, workflow management in optiSLang allows you to forward optimal design candidates from one optimization loop to the next. It also allows you to nest investigation loops (e.g., performance maps) or combine different disciplines.
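Why pre-optimization on a metamodel pays off can be shown in a few lines: the surrogate is so cheap that thousands of candidates can be screened before any real solver run. The surrogate function, bounds and simple random search below are illustrative; optiSLang's actual optimizers (NLPQL, NOA, ARSM, AMOP) are far more sophisticated.

```python
# Sketch of pre-optimization on a cheap metamodel to find a start design.
import random

def surrogate(x1, x2):
    # Toy metamodel of an objective to minimize (true optimum at (2, -1)).
    return (x1 - 2.0) ** 2 + (x2 + 1.0) ** 2

bounds = {"x1": (-5.0, 5.0), "x2": (-5.0, 5.0)}

random.seed(2)
best_x, best_f = None, float("inf")
for _ in range(20_000):                      # cheap: surrogate only, no solver
    x = {k: random.uniform(lo, hi) for k, (lo, hi) in bounds.items()}
    f = surrogate(**x)
    if f < best_f:
        best_x, best_f = x, f

# best_x would then be forwarded as the start design for the real-solver loop.
print(best_x, best_f)
```

Forwarding this candidate means the expensive solver-based optimization starts near the optimum instead of from an arbitrary point in the design space.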
The software’s sensitivity analysis, together with its optimization algorithms, is often used to calibrate simulation models to their “real-world twin”. Here, measurement data represent characteristic system responses that are critical to validate and improve the physical model of the system. In the context of parameter identification, model calibration means using experimental observations and simulation runs to identify unknown or uncertain simulation model parameters.
By means of sensitivity analysis, the parameters that influence the simulation results and the calibration procedure must first be detected. It is important to reduce the set of parameters to those that have significant influence on the result variability and to deactivate the others: without significant sensitivity they cannot be identified, and any optimization would yield meaningless values for insignificant parameters. Next, sensitivity analysis helps to define relevant parameter ranges and suitable measures to quantify the difference between measurement and simulation. Finally, you can analyze whether the inverse problem can be solved unambiguously, i.e., whether a unique parameter combination exists that allows a unique match between measurement and simulation. An automated model calibration efficiently identifies relevant or non-measurable parameters to achieve the best possible match between simulation results and test data.
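The core of such a calibration is a least-squares discrepancy between measurement and simulation, minimized over the unknown parameter. In this sketch the "simulation" is a toy closed-form model and the measurements are invented; in practice each evaluation would be a solver run driven by optiSLang.

```python
# Minimal parameter-identification sketch: find the model parameter that
# best reproduces measured data by minimizing a least-squares discrepancy.

measurements = [(0.0, 0.0), (1.0, 1.9), (2.0, 4.1), (3.0, 6.0)]

def simulate(stiffness, t):
    return stiffness * t            # toy model; data suggest stiffness ~ 2

def discrepancy(stiffness):
    return sum((simulate(stiffness, t) - y) ** 2 for t, y in measurements)

# Brute-force scan over the parameter range identified by sensitivity analysis.
best = min((k / 1000.0 for k in range(0, 5001)), key=discrepancy)
print(f"identified stiffness = {best:.3f}")
```

The preceding sensitivity step matters here: if the response were insensitive to the stiffness, the discrepancy curve would be flat and the identified value meaningless.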
Robust Design & Reliability
Manufacturing tolerances, material scatter, randomly distributed loads or other stochastic effects cause scattering properties of product performance. Signal data, such as load curves or data from measurement protocols (e.g., load strain curves, frequency responses) are also often subject to scattering or uncertainty. To ensure your product quality, avoid product recalls or meet safety requirements, you can take these variations into account in the virtual prototyping process by applying stochastic analysis and statistical methods.
Furthermore, optimized designs are often pushed to their performance boundaries, e.g., regarding material strength. It is therefore necessary to investigate the impact of scattering input variables, e.g., geometry, material parameters, boundary conditions or loads, on these designs. To cope with the unavoidable uncertainties in operating conditions as well as in manufacturing processes, it is essential to introduce an appropriate robustness evaluation and measurement based on stochastic analysis. Our consulting experience shows that, in addition to standard deviation and sigma levels, the coefficient of variation is a suitable robustness measure, comparing the relative variation of the critical model responses to the relative variation of the input variables.
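The coefficient-of-variation measure mentioned above is quickly illustrated: compare the relative scatter of a response to the relative scatter of its input. The toy model and input distribution are assumptions for the sketch.

```python
# Sketch of the coefficient of variation (CoV) as a robustness measure.
import random
import statistics

def response(thickness):
    # Toy model: stress grows nonlinearly as the wall gets thinner.
    return 1000.0 / thickness ** 2

random.seed(3)
inputs = [random.gauss(5.0, 0.1) for _ in range(2000)]   # ~2% input scatter
outputs = [response(t) for t in inputs]

def cov(samples):
    return statistics.stdev(samples) / statistics.fmean(samples)

print(f"input CoV  = {cov(inputs):.3f}")
print(f"output CoV = {cov(outputs):.3f}")
```

Here the output CoV is roughly twice the input CoV, showing that the design amplifies input scatter; a robust design would keep that ratio small.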
Robustness evaluation, like sensitivity analysis, identifies the most important scattering variables, and a decision tree helps you select the most appropriate algorithms to verify robustness or reliability for your specific situation. Ansys optiSLang quantifies the robustness of designs by generating a set of suitable design variations based on scattering input variables. Optimized Latin hypercube sampling, together with quantification of each input’s contribution to the result variation by the coefficient of prognosis (CoP), ensures the reliability of the variation and correlation measures with a minimum of required design variants.
If designs need to meet high safety or quality requirements with low event probabilities of less than 1 out of 1000, a reliability analysis is necessary to investigate how these designs are affected by scattering input variables, e.g., geometry, material parameters, boundary conditions or loads. As an alternative to the estimation of safety distances by using standard deviations in robustness evaluations, a reliability analysis calculates the probability of exceeding a certain limit by using stochastic algorithms. As a result, rare event violations can be quantified and proven to be less than the accepted value. For reliability analysis, optiSLang provides powerful numerical algorithms for the determination of small event violation probabilities.
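The quantity a reliability analysis estimates can be shown with plain Monte Carlo sampling on a toy limit state. The load and strength distributions are invented; note that for realistically small probabilities, plain Monte Carlo needs enormous sample counts, which is exactly why optiSLang's specialized algorithms for small event probabilities matter.

```python
# Illustrative Monte Carlo estimate of a small exceedance probability.
import random

def limit_state(load, strength):
    return strength - load          # failure when this margin is negative

random.seed(4)
n, failures = 200_000, 0
for _ in range(n):
    load = random.gauss(100.0, 10.0)       # assumed load distribution
    strength = random.gauss(150.0, 10.0)   # assumed strength distribution
    if limit_state(load, strength) < 0:
        failures += 1

p_f = failures / n
print(f"estimated failure probability = {p_f:.2e}")
```

The estimate lands around 2e-4, i.e., well below a 1-in-1000 requirement; a specialized reliability algorithm would reach the same conclusion with a small fraction of the runs.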
Quality is one of the most important product properties. Ensuring it in an optimal manner reduces costs for rework, scrap, recalls or even legal actions, and satisfies customers’ demand for reliability. During product development, the common approach to achieve this goal is to combine optimization and robustness/reliability evaluation in what is known as robust design optimization (RDO) or uncertainty quantification (UQ). The method uses the results of stochastic analysis as constraints or objectives in the optimization. By combining workflows of optimization and robustness/reliability, optiSLang allows you to appropriately consider uncertainties in the optimization process and to perform the essential final verification of critical scenarios.