OptiY

OptiY is a design environment software that provides modern optimization strategies and state-of-the-art probabilistic algorithms for uncertainty, reliability, robustness and sensitivity analysis, data mining and meta-modeling.


Features

OptiY is an open-source, multidisciplinary design environment that provides direct and generic interfaces to many CAD/CAE systems and in-house codes. A comprehensive COM interface and a user node with predefined templates are also available, so that users can integrate external programs themselves. Any system can be inserted into an arbitrary process chain using the graphical workflow editor. Different classes of simulation models can be coupled, such as networks, finite element models, multi-body systems and material test benches.


Data mining

Data mining is the process of extracting hidden patterns from data; it identifies trends that go beyond simple data analysis. Through sophisticated algorithms, non-statistician users have the opportunity to identify key attributes of processes and target opportunities. Data mining is an increasingly important tool for transforming data into information, and is commonly used in a wide range of applications such as manufacturing, marketing, fraud detection and scientific discovery.


Sensitivity analysis

Local sensitivity measures, such as correlation coefficients and partial derivatives, can only be used if the relationship between input and output is linear. If the relationship is nonlinear, global sensitivity analysis has to be used, based on the variance relationship between the input and output distributions, such as the Sobol index. With sensitivity analysis, the system complexity can be reduced and the cause-and-effect chain can be explained.
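A minimal sketch of variance-based global sensitivity analysis using the pick-freeze scheme (the test function, sample size and uniform inputs are illustrative assumptions, not taken from OptiY):

```python
import random
import statistics

def model(x1, x2):
    # Hypothetical nonlinear test function: linear in x1, quadratic in x2,
    # with both inputs uniform on [0, 1].
    return x1 + x2 ** 2

def sobol_first_order(n=50000, seed=0):
    """Estimate first-order Sobol indices S_i = Var(E[Y|X_i]) / Var(Y)."""
    rng = random.Random(seed)
    A = [(rng.random(), rng.random()) for _ in range(n)]
    B = [(rng.random(), rng.random()) for _ in range(n)]
    yA = [model(*a) for a in A]
    mean_yA = statistics.fmean(yA)
    var_y = statistics.pvariance(yA)
    indices = []
    for i in range(2):
        # Re-evaluate with input i frozen from matrix A, the rest from B.
        yC = [model(*(a[j] if j == i else b[j] for j in range(2)))
              for a, b in zip(A, B)]
        cov = statistics.fmean(ya * yc for ya, yc in zip(yA, yC)) - mean_yA ** 2
        indices.append(cov / var_y)
    return indices
```

For this additive test function the two first-order indices sum to approximately one; a large gap between their sum and one would indicate interaction effects.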


Probabilistic simulation

Variability, uncertainty, tolerances and errors of technical systems play an important part in the product design process. They are caused by manufacturing inaccuracy, process uncertainty, environmental influences, wear and human factors, and they are characterized by stochastic distributions. Deterministic simulation cannot predict the real system behavior under input variability and uncertainty, because one model calculation yields only one point in the design space. Instead, a probabilistic simulation has to be performed: the output distributions are computed from the input distributions, based on the deterministic simulation model of any simulation system. The realistic system behavior can then be derived from these output distributions.
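The idea can be sketched with plain Monte Carlo propagation through a deterministic model (the spring model and the scatter values are hypothetical, chosen only for illustration):

```python
import random
import statistics

def deflection(force, stiffness):
    # Deterministic simulation model: deflection of a linear spring.
    return force / stiffness

def propagate(n=50000, seed=1):
    """Propagate input scatter through the deterministic model."""
    rng = random.Random(seed)
    samples = [deflection(rng.gauss(100.0, 5.0),     # load scatter
                          rng.gauss(2000.0, 50.0))   # stiffness scatter
               for _ in range(n)]
    # The output distribution, not a single nominal value, describes the
    # realistic system behavior.
    return statistics.fmean(samples), statistics.stdev(samples)
```

A single nominal calculation would return only 100 / 2000 = 0.05; the probabilistic run additionally quantifies the scatter of the deflection induced by the input variability.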


Reliability analysis

The variability of parameters often causes failure of the system. Reliability analysis (see also failure mode and effects analysis, FMEA) investigates boundary violations of the outputs due to input variability. The failure mechanisms of components are specified for product development and are identified by measurement, field data collection, material data, customer specifications, etc. In the simulation, the satisfaction of all product specifications is expressed as constraints on the simulation results. The system is reliable if all constraints scatter inside the defined boundaries. Even if a nominal-parameter simulation shows all constraint values within reliable boundaries, system reliability cannot be guaranteed due to input variability. The fraction of the constraint variability that violates the defined boundaries is called the failure probability of the solution. Reliability analysis computes the failure probability of the individual components and of the total system at a given point in time.
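A failure probability of this kind can be estimated by sampling the constraint under input scatter (the stress/strength model and all distribution parameters below are illustrative assumptions):

```python
import random

def failure_probability(n=100000, seed=2):
    """Monte Carlo estimate of P(constraint violated) under input scatter."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        # Hypothetical normally distributed inputs (values for illustration).
        stress = rng.gauss(300.0, 20.0)     # applied stress, MPa
        strength = rng.gauss(400.0, 25.0)   # material strength, MPa
        # Constraint: strength must exceed stress; a violation is a failure.
        if strength < stress:
            failures += 1
    return failures / n
```

Although the nominal margin (400 vs. 300 MPa) looks safe, the overlapping tails of the two distributions still produce a small but nonzero failure probability.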


Meta-modeling

Meta-modeling, or surrogate modeling, is a process for deriving the mathematical relationship between design parameters and product characteristics. For each point in the parameter space there is a corresponding point in the design space. Many model calculations would be needed to map the relationship between input and output systematically (full factorial design), which is practically infeasible when the product model is computationally expensive. Adaptive response surface methodology can be used to solve this problem.
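A simple response surface can be sketched as a polynomial least-squares fit; this is a deliberately minimal stand-in for the adaptive surrogate models the text refers to, and the sample points are invented:

```python
def fit_quadratic(xs, ys):
    """Least-squares fit of the surrogate y = c0 + c1*x + c2*x^2."""
    # Power sums for the 3x3 normal equations.
    S = [sum(x ** k for x in xs) for k in range(5)]
    T = [sum(y * x ** k for x, y in zip(xs, ys)) for k in range(3)]
    A = [[S[0], S[1], S[2], T[0]],
         [S[1], S[2], S[3], T[1]],
         [S[2], S[3], S[4], T[2]]]
    # Gauss-Jordan elimination on the augmented matrix.
    for i in range(3):
        pivot = A[i][i]
        A[i] = [v / pivot for v in A[i]]
        for j in range(3):
            if j != i:
                factor = A[j][i]
                A[j] = [vj - factor * vi for vj, vi in zip(A[j], A[i])]
    return [row[3] for row in A]

def surrogate(coeffs, x):
    # Cheap replacement for an expensive model evaluation.
    return coeffs[0] + coeffs[1] * x + coeffs[2] * x ** 2
```

Once fitted from a handful of expensive model runs, the surrogate can be evaluated thousands of times during optimization at negligible cost.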


Fatigue life prediction

Predicting material fatigue has long been one of the most important problems in design engineering for reliability and quality. Fatigue predictions have several practical uses: rapid design optimization during the development phase of a product, predicting field-use limits, and failure analysis of products returned from the field or failed in qualification tests. Fatigue analysis focuses on thermal and mechanical failure mechanisms. Most fatigue failures can be attributed to thermo-mechanical stresses caused by differences in thermal and mechanical expansion coefficients; they occur when the component experiences cyclic stresses and strains that produce permanent damage.
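One classical stress-life relation used in such predictions is Basquin's law; the sketch below solves it for the number of cycles to failure (the material constants are hypothetical, not specific to any OptiY model):

```python
def basquin_cycles(stress_amplitude, sigma_f=900.0, b=-0.1):
    """Cycles to failure N from Basquin's relation sigma_a = sigma_f * (2N)^b.

    sigma_f (fatigue strength coefficient) and b (fatigue strength exponent)
    are hypothetical material constants chosen for illustration.
    """
    two_n = (stress_amplitude / sigma_f) ** (1.0 / b)
    return two_n / 2.0
```

Because b is negative, a higher cyclic stress amplitude yields a shorter predicted life, which matches the qualitative behavior described above.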


Multi-objective optimization

In the development process of technical products there are frequently design problems with many evaluation goals or criteria, such as low cost, high quality and low noise. Design parameters have to be found that minimize all criteria. In contrast to single-objective optimization, multi-objective optimization has a different order structure between the parameter and criteria spaces: the criteria conflict with each other, and minimizing one criterion may maximize another. There is therefore not a single solution but a frontier of Pareto-optimal solutions. Multi-objective optimization finds all Pareto solutions automatically in a single run. A multiple-criteria decision-making support tool is also available to select the most suitable solution among them.
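Extracting the Pareto frontier from a set of candidate designs can be sketched as follows (the candidate points are invented; all criteria are minimized):

```python
def dominates(q, p):
    """q dominates p if q is no worse in every criterion and better in one."""
    return (all(qi <= pi for qi, pi in zip(q, p))
            and any(qi < pi for qi, pi in zip(q, p)))

def pareto_front(points):
    # Keep every point that no other point dominates.
    return [p for p in points if not any(dominates(q, p) for q in points)]
```

For candidates evaluated on two conflicting criteria, e.g. (cost, noise), the frontier contains exactly the trade-off solutions: improving one criterion from any frontier point necessarily worsens the other.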


Robust design optimization

Variability, uncertainty and tolerances have to be considered in the design process of technical systems to assure the required quality and reliability. They are uncontrollable and unpredictable, and they make the satisfaction of the required product specifications uncertain. The design goal is to assure the specified product functionality in spite of this unavoidable variability and uncertainty. The approach to solving this problem is robust design of the product parameters early in the design process (robust parameter design, RPD). Optimal product parameters should be found for which the system behavior is robust and insensitive to the unavoidable variability, i.e. a given input variability and uncertainty leads only to the smallest possible variability of the product characteristics, so that the required product specifications are always satisfied.

Sung H. Park: ''Robust design and analysis for quality engineering''. Chapman & Hall, 1996.
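The difference between a nominal optimum and a robust optimum can be sketched with a one-parameter toy problem: a deep but narrow optimum loses to a shallower, flatter one once input scatter is taken into account (the response function, noise level and penalty weight are all illustrative assumptions):

```python
import math
import random
import statistics

def characteristic(x):
    # Hypothetical product characteristic: a deep, narrow optimum near x = 1
    # and a shallower but much flatter optimum near x = 3.
    return (1.0
            - math.exp(-(x - 1.0) ** 2 / (2 * 0.1 ** 2))
            - 0.8 * math.exp(-(x - 3.0) ** 2 / (2 * 0.5 ** 2)))

def robust_objective(nominal, sigma=0.3, n=4000, seed=3):
    """Mean plus penalized scatter of the characteristic under input noise."""
    rng = random.Random(seed)
    ys = [characteristic(rng.gauss(nominal, sigma)) for _ in range(n)]
    return statistics.fmean(ys) + 3.0 * statistics.stdev(ys)

def robust_optimum(candidates):
    # The robust design is the nominal value with the best robust objective.
    return min(candidates, key=robust_objective)
```

Deterministically, x = 1 is the better design point, but under parameter scatter the narrow optimum is often missed, so the robust objective selects the flat region near x = 3 instead.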


References


External links

* OptiY University
* Analysis and Model Based Optimization of an Electromagnetic Valve Actuator
* Probabilistic Optimization of Polarized Magnetic Actuators by Coupling of Network and Finite Element Models
* Robust Design and Optimization of Thick Film Accelerometers in COMSOL Multiphysics with OptiY
* Robust Design Optimization with OptiY
* Meta-Modeling With OptiY - Winning Mathematical Surrogate Models from Measurement Data or Complex Finite Element Analysis
* Sensitivity Study, Design Optimization and Tolerance Analysis of a Car Suspension in RecurDyn