Driving the Design of Small Modular Nuclear Reactors with Flownex

The development of small modular nuclear reactors, or SMRs, is a complex task that involves balancing the thermodynamic performance of the entire system. Flownex is an ideal tool for modeling pressure drop (flow) and heat transfer (temperature) across the connected components of a complete system, in both steady state and transient conditions, and for sizing and optimizing pumps, compressors, pipes, valves, tanks, and heat exchangers.

To highlight this power and capability,  PADT and Flownex will be exhibiting at the 2016 SMR conference in Atlanta where we will be available to discuss exciting new Flownex developments in system and subsystem simulations of SMRs.  If you are attending this year’s event, please stop by the Flownex booth and say hello to experts from M-Tech and PADT.

If you are not able to make the conference or if you want to know more now, you can view more information from the new Flownex SMR brochure or this video:

Why is Flownex a Great Tool for SMR Design and Simulation?

These new Flownex developments greatly reduce the time required for typical design tasks on Small Modular Nuclear Reactor (SMR) projects, including sizing of major components, calculating overall plant efficiency, and designing for controllability.

These tasks involve typical components such as the reactor primary loop, intermediate loops, heat exchangers or steam generators, and the power generation cycle. Flownex provides for a variety of reactor fuel geometries, reactor coolant types, and power cycles.

Flownex can also be used for determining plant control philosophy. By using a plant simulation model, users can determine the transient response of sensed parameters to changes in input parameters and based on that, set up appropriate pairings for control loops.

For passive safety system design, Flownex can be used to optimize the natural circulation loops. The program can calculate the dynamic plant-wide temperatures and pressures in response to various accident scenarios, taking into account decay heat generation, multiple natural circulation loops, transient energy storage, and rejection to ambient conditions.
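
To give a feel for the kind of heat load such a natural circulation loop has to reject, here is a minimal hand calculation (not a Flownex model) of post-shutdown decay heat using a Way-Wigner style approximation. The constant and exponents are textbook approximations that vary between references, and the reactor rating and operating history below are made-up numbers for illustration only.

```python
# Rough decay-heat estimate after reactor shutdown (Way-Wigner style approximation).
# Illustrative hand calculation only -- not Flownex output. The 0.066 constant and
# -0.2 exponents are textbook approximations and vary slightly between references.

def decay_heat_fraction(t_after_shutdown_s, t_operating_s):
    """Fraction of full thermal power still produced t seconds after shutdown."""
    t = t_after_shutdown_s
    T = t_operating_s
    return 0.066 * (t ** -0.2 - (t + T) ** -0.2)

p_full_mw = 160.0            # hypothetical SMR thermal rating, MWt
t_op = 365 * 24 * 3600.0     # assume one year at full power before shutdown

for hours in (1, 24, 24 * 7):
    frac = decay_heat_fraction(hours * 3600.0, t_op)
    print(f"{hours:5d} h after shutdown: {frac * 100:4.2f}% of full power "
          f"= {frac * p_full_mw:5.2f} MWt to be removed by natural circulation")
```

Even a week after shutdown this sketch predicts on the order of half a megawatt of thermal power that the passive loops must carry to ambient, which is exactly the kind of boundary condition a plant-wide transient model has to track.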


Learn more at www.padtinc.com/flownex, give us a call at 480.813.4884 or email brian.duncan@padtinc.com.

 

7 Reasons why ANSYS AIM Will Change the Way Simulation is Done

When ANSYS, Inc. released their ANSYS AIM product they didn't just introduce a better way to do simulation, they introduced a tool that will change the way we all do simulation. A bold statement, but after PADT has used the tool here, and worked with customers who are using it, we feel confident that this software package will drive that level of change. It enables the type of change that will drive down schedule time and cost for product development, and allow companies to use simulation more effectively to drive their product development towards better performance and robustness.

It’s Time for a Productivity Increase

If you have been doing simulation as long as I have (29 years for me) you have heard it before. And sometimes it was true. GUIs on solvers were the first big change I saw. Then came robust 3D tetrahedral meshing, which we coasted on for a while until fully associative and parametric CAD connections made another giant step forward in productivity and simulation accuracy. Then, more recently, robust CFD meshing of dirty geometry. And of course HPC improvements on the solver side.

That was then.  Right now everyone is happily working away in their tool of choice, simulating their physics of choice.  ANSYS Mechanical for structural, ANSYS Fluent for fluids, and maybe ANSYS HFSS for electromagnetics. Insert your tool of choice, it doesn’t really matter. They are all best-in-breed advanced tools for doing a certain type of physical simulation.  Most users are actually pretty happy. But if you talk to their managers or methods engineers, you find less happiness. Why? They want more engineers to have access to these great tools and they also want people to be working together more with less specialization.

Putting it all Together in One Place

ANSYS AIM is, among many other things, an answer to this need. Instead of one new way of doing something or a new breakthrough feature, it is a product that puts everything together to deliver a step change in productivity. It is built on top of these same world-class, best-in-breed solvers. But from the ground up it is an environment that enables productivity, processes, ease-of-use, collaboration, and automation. All in one tool, with one interface.

Changing the Way Simulation is Done

Before we list where we see things changing, let’s repeat that list of what AIM brings to the table, because those key deliverables in the software are what are driving the change:

  • Improved Productivity
  • Standardized Processes
  • True Ease-of-Use
  • Inherent Collaboration
  • Intuitive Automation
  • Single Interface

Each of these on its own would be good, but together, they allow a fundamental shift in how a simulation tool can be used. And here are the seven ways we predict you will be doing things differently.

1) Standardized processes across an organization

The workflow in ANSYS AIM is process oriented from the beginning, which is a key step in standardizing processes.  This is amplified by tools that allow users, not just programmers, to create templates, capturing the preferred steps for a given type of simulation.  Others have tried this in the past, but the workflows were either too rigid or not able to capture complex simulations.  This experience was used to make sure the same thing does not happen in ANSYS AIM.

2) No more “good enough” simulation done by Design Engineers

Ease-of-use and training issues have kept robust simulation tools out of the hands of design engineers. Programs for that group of users have usually been so watered down or lacking so much functionality that they simply deliver a quick answer. The math is the same, but it is not as detailed or accurate. ANSYS AIM solves this by giving the design engineer a tool they can pick up and use, but that also gives them access to the most capable solvers on the market.

3) Multiphysics by one user

Multiphysics simulation often involves the use of multiple simulation tools, say a CFD solver and a thermal solver. The problem is that very few users have the time to learn two or more tools, and to learn how to hook them together. So some multiphysics work is done by several experts working together, some in tools that do multiple physics but none of them well, and some by a rare expert who has multi-tool expertise. Because ANSYS AIM is a multiphysics tool from the ground up, built on high-powered physics solvers, these limitations go away and almost any engineer can now do multiphysics simulation.

4) True collaboration

The issues discussed above about multiphysics requiring multiple users in most tools also inhibit true collaboration. Using one user's model in one tool is difficult when another user has another tool. Collaboration is also difficult when so much differs between processes. The workflow-driven approach in ANSYS AIM lends itself to collaboration, and the consistent look-and-feel makes it happen.

5) Enables use when you need it

This is a huge one.  Many engineers do not use simulation tools because they are occasional users.  They feel that the time required to re-familiarize themselves with their tools is longer than it takes to do the simulation. The combination of features unique to ANSYS AIM deal with this in an effective manner, making accurate simulation something a user can pick up when they need it, use it to drive their design, and move on to the next task.

6) Stepping away from CAD embedded Simulation

The growth of CAD-embedded simulation tools, programs that are built into a CAD product, has been driven by the need to tightly integrate with geometry and provide ease of use for the users who only occasionally need to do simulation. Although the geometry integration was solved years ago, the ease-of-use and process control needed are only now becoming available in a dedicated simulation tool with ANSYS AIM.

7) A Return to home-grown automation for simulation

If you have been doing simulation since the 1980s like I have, you probably remember a day when every company had scripts and tools they used to automate their simulation process. They were extremely powerful and delivered huge productivity gains. But as tools got more powerful and user interfaces became more mature, the ability to create your own automation tools faded. You needed to be a programmer. ANSYS AIM brings this back with recording and scripting for every feature in the tool, with a common and easy-to-use language: Python.

How does this Impact Me and/or my Company?

It is kind of fun to play prognosticator and try and figure out how a revolutionary advance in our industry is going to impact that industry. But in the end it really does not matter unless the changes improve the product development process. We feel pretty strongly that it does.  Because of the changes in how simulation is done, brought about by ANSYS AIM, we feel that more companies will use simulation to drive their product development, more users within a company will have access to those tools, and the impact of simulation will be greater.


To fully grasp the impact you need to step back and ponder why you do simulation.  The fast cars and crazy parties are just gravy. The core reason is to quickly and effectively test your designs.  By using virtual testing, you can explore how your product behaves early in the design process and answer those questions that always come up.  The sooner, faster, and more accurately you answer those questions, the lower the cost of your product development and the better your final product.

Along comes a product like ANSYS AIM.  It is designed by the largest simulation software company in the world to give the users of today and tomorrow access to the power they need. It enables that “sooner, faster, and more accurately” by allowing us to change, for the better, the way we do virtual testing.

The best way to see this for yourself is to explore ANSYS AIM.  Sign up for our AIM Resource Kit here or contact us and we will be more than happy to show it to you.


Free ANSYS AIM Resource Kit — Expert Advice, Insights and Best Practices for Multiphysics Simulation

We have been talking a lot about ANSYS AIM lately. Mostly because we really like ANSYS AIM and we think a large number of engineers out there need to know more about it and understand its advantages. And the way we do that is through blog posts, emails, seminars, and training sessions. A new tool that we have started using is “Resource and Productivity Kits,” collections of information that users can download.

Earlier in the year we introduced several kits, including ANSYS Structural, ANSYS Fluids, and ANSYS ElectroMechanical.  Now we are pleased to offer up a collection of useful information on ANSYS AIM.  This kit includes:

  • “Getting to know ANSYS AIM,” a video by PADT application engineer Manoj Mahendran
  • “What I like about ANSYS AIM,” a video featuring insights on the tool
  • Six ANSYS AIM demonstration videos, including simulations and a custom template demonstration
  • Five slide decks that provide an overview of ANSYS AIM and describe its new features
  • An exclusive whitepaper on effectively training product development engineers in simulation.

You can download the kit here.

If you need more info, view the ANSYS AIM Overview video or read about it on our ANSYS AIM page.

Watch this blog for more useful content on AIM in the future.



Presentation: Leveraging Simulation for Product Development of IoT Devices


Yours truly going over the impact of Simulation on IoT Product Development

The local SEMI chapter here in Arizona held a breakfast meeting on Monetizing the Internet of Things (IoT), and PADT was pleased to be one of the presenters. Always a smart group, the attendees gave us a chance to sit with people making the sensors, chips, and software that enable the IoT and dig deep into where things are and where they need to be.

The event was hosted by one of our favorite customers, and neighbor right across the street, Freescale Semiconductor. Speakers included IoT experts from Freescale, Intel, Medtronic, ASU, and SEMICO Research.

Not surprisingly I talked about how Simulation can play a successful role in product development of IoT devices.

You can download a copy of the presentation here: PADT-SEMI-IOT-Simulation-1.pdf

UPDATE (11/9/2015): Great write-up by Don Dingee on this event in the SemiWiki. Click here to read it. It includes a great summary of the other speakers.

You can also see more details on how people use Simulation for this application on the ANSYS, Inc. website here.  We also like this video from ANSYS that shows some great applications and how ANSYS is used with them:

A couple of common themes resonated across the speakers:

  1. Price and size need to come down on the chips used in IoT (this was a semiconductor group, so this is a big part of their focus)
  2. Lowering power usage and increasing power density in batteries is a key driver
  3. The biggest issue in IoT is privacy and security: keeping your data private and keeping people from hacking into IoT devices.
  4. Another big problem is dealing with all the data collected by IoT devices. How to make it useful and how to store it all.  One answer is reducing the data on the device, another is only keeping track of what changes.
  5. It is early; standards are needed, but they are still forming.

If you look at this list, the first two problems are addressable with simulation.


PADT has a growing amount of experience helping customers simulate and design IoT devices, as well as the chips, sensors, and antennas that go into them. To learn more, shoot us an email at info@padtinc.com or call 480.813.4884.

 

Free Training and Evaluation for ANSYS AIM

PADT is hosting a series of free training classes to introduce users to ANSYS AIM.  We have pasted the invitation below.  You can register here.  We are very excited about this new tool from ANSYS, Inc. and are eager to share it with everyone. Look for more AIM information on this blog in the near future.

Free Training and Evaluation for ANSYS® AIM™.
Register Today – Seats Are Limited.

Discover how to design your next product
better… and faster


ANSYS AIM: Integrated Multiphysics Simulation Environment
for All Engineers


Free Training and Evaluation for ANSYS® AIM™ – An Integrated Multi-physics Simulation Environment for All Engineers

As a special offer, PADT Inc. is offering FREE “Jump Start” training and hands-on evaluation for ANSYS® AIM™. Design engineers, method engineers and managers seeking to learn the latest simulation software, boost adoption and usability for the occasional user, or extend their existing CAD-based tool’s limited functionality will benefit from this no-obligation course.

Register Today – Seats are limited and will be filled on a first-come, first-served basis. On completion of the class, you’ll be qualified to receive and use a FREE 30-day ANSYS AIM download for evaluation.

All classes will be held from 9:00 a.m. – 4:00 p.m. local time and include a complimentary lunch.

PADT’s support team of ANSYS experts will help attendees understand where ANSYS AIM fits into their organization and workflow. The class will address both situations described above and show how ANSYS AIM combines the geometry integration of CAD-based systems with the ease of use of a modern tool, in a product that steps the occasional user through the process without limiting functionality.

Watch this short video to learn more about the capabilities and benefits of ANSYS® AIM™ for the simulation of 3-D physics and multiphysics.

Contact our ANSYS experts 1-800-293-PADT, info@padtinc.com

CoresOnDemand: Helping Engineers Do Their Magic

Engineers Do Magic

In the world of simulation there are two facts of life. First, the deadline of “yesterday would be good” is not uncommon. Funding deadlines, product roll-out dates, and unexpected project requirements are all reliable sources of last-minute changes. Engineers are required to do quality work and deliver reliable results with limited time and resources. In essence, to perform sorcery.


Second, the size and complexity of models can vary wildly. Anything from fasteners and gaskets to complete systems or structures can be in the pipeline. Engineers can be looking at any combination of hundreds of variables that impact the resources required for a successful simulation.

Required CPU cores, RAM per core, interconnect speeds, available disk space, operating system and ANSYS version all vary depending on the model files, simulation type, size, run-time and target date for the results.

Engineers usually do magic. But sometimes limited time, or resources that are out of reach, can delay on-time delivery of project tasks.

At PADT, We Can Help

PADT Inc. has been nostrils deep in engineering services and simulation products for over 20 years. We know engineering, we know how to simulate engineering and we know ANSYS very well. To address the challenges our customers are facing, in 2015 PADT introduced CoresOnDemand to the engineering community.


CoresOnDemand offers the combination of our proven CUBE cluster, ANSYS simulation tools and the PADT experience and support as an on demand simulation resource. By focusing on the specific needs of ANSYS users, CoresOnDemand was built to deliver performance and flexibility for the full range of applications. Specifics about the clusters and their configurations can be found at CoresOnDemand.com.

CoresOnDemand is a high performance computing environment purpose built to help customers address numerical simulation needs that require compute power that isn’t available or that is needed on a temporary basis.

Call Us We’re Nice

CoresOnDemand is a new service in the world of on-demand computing. Prospective customers just need to give us a call or send us an inquiry here to get all of their questions answered. The engineers behind CoresOnDemand have a deep understanding of the ANSYS tools and distributed computing and are able to assess and properly size a compute environment that matches the needed resources.

Call us, we’re nice!

Two Halves of the Nutshell

The process for executing a lease on a CoresOnDemand cluster is quite straightforward. There are two parts to a lease:

PART 1: How many cores & how long is the lease for?

By working with the PADT engineers – and possibly benchmarking their models – customers can set a realistic estimate on how many cores are required and how long their models need to run on the CoresOnDemand clusters. Normally, leases are in one-week blocks with incentives for longer or regular lease requirements.

Clusters are leased in one-week blocks, but we’re flexible.

PART 2: How will ANSYS be licensed?

An ANSYS license is required in order to run on the CoresOnDemand environment.  A license lease can be generated by contacting any ANSYS channel partner. PADT can generate license leases in Arizona, Colorado, New Mexico, Utah & Nevada. Licenses can also be borrowed from the customer’s existing license pool.

An ANSYS license may be leased from an ANSYS channel partner or borrowed from the customer’s existing license pool.

Using the Cluster

Once the CoresOnDemand team has completed the cluster setup and user creation (takes a couple of hours for most cases), customers can login and begin using the cluster. The CoresOnDemand clusters allow customers to use the connection method they are comfortable with. All connections to CoresOnDemand are encrypted and are protected by a firewall and an isolated network environment.

Step 1: Transfer files to the cluster

Files can be transferred to the cluster using Secure Copy Protocol (SCP), which creates an encrypted tunnel for copying files. A graphical tool is also available for Windows users (and it's free). Larger files can also be loaded to the cluster manually by sending a DVD, Blu-ray disc, or external storage device to PADT. The CoresOnDemand team will mount the volume and can assist in the copying of data.
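
For users who prefer to script their transfers, a push over the encrypted channel can look something like the sketch below, which uses the paramiko library. This is only a generic illustration: the host name, user name, key path, and file names are placeholders, not actual CoresOnDemand account details, which are supplied with the lease.

```python
# Minimal sketch of pushing a model file to the cluster over SSH/SFTP using paramiko.
# Host, user, key path, and file names are placeholders -- use the credentials
# supplied with your lease.
import paramiko

HOST = "cluster.example.com"      # placeholder address
USER = "leaseuser"                # placeholder account name
KEY = "/home/me/.ssh/id_rsa"      # placeholder private key

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(HOST, username=USER, key_filename=KEY)

# Copy the input deck into the job directory on the cluster over the encrypted channel.
sftp = client.open_sftp()
sftp.put("valve_model.dat", "/scratch/leaseuser/valve_model.dat")
sftp.close()
client.close()
```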

Step 2: Connect to the cluster and start jobs

Customers can connect to the cluster through an SSH connection. This is the most basic interface, where users can launch interactive or batch processing jobs on the cluster. SSH is secure, fast and very stable. The downside of SSH is that it has limited graphical capabilities.

Another option is to use the Nice Software Desktop Cloud Visualization (DCV) interface. DCV provides enhanced interactive 2D/3D access over a standard network. It enables users to access the cluster from anywhere on virtually any device with a screen and an internet connection. The main advantage of DCV is the ability to start interactive ANSYS jobs and monitor them without the need for a continuous connection. For example, a user can connect from his laptop to launch the job and later use his iPad to monitor the progress.


Figure 1. 12 Million cell model simulated on CoresOnDemand

The CoresOnDemand environment also has the Torque resource manager implemented, so customers can submit multiple jobs to a job queue and run them in sequence without any manual intervention.
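
As an illustration of what a batch submission to Torque might look like, the sketch below writes a small PBS job script and hands it to qsub. The job name, resource requests, and the solver command line are placeholders; the actual executable name, core counts, and queue settings depend on the lease and the installed ANSYS version.

```python
# Illustrative Torque/PBS batch submission -- job name, resource requests, and the
# solver command line are placeholders to be adapted to the actual lease.
import subprocess
import textwrap

job_script = textwrap.dedent("""\
    #!/bin/bash
    #PBS -N valve_model
    #PBS -l nodes=2:ppn=16
    #PBS -l walltime=24:00:00
    #PBS -j oe
    cd $PBS_O_WORKDIR
    # Solver executable and options depend on the installed ANSYS version.
    ansys162 -b -dis -np 32 -i valve_model.dat -o valve_model.out
    """)

with open("valve_model.pbs", "w") as f:
    f.write(job_script)

# qsub places the job in the queue; Torque runs it when the requested cores free up.
subprocess.run(["qsub", "valve_model.pbs"], check=True)
```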

Customers can use SCP or ship external storage to get data on the cluster. SSH or DCV can be used to access the cluster. Batch, interactive or Torque scheduler can be used to submit and monitor jobs.

All Done?

Once the simulation runs are completed, customers usually choose one of two methods to transfer data back. The first is to download the results over the internet using SCP (mentioned earlier); the second is to have external media shipped back (the media can be encrypted if needed).

After the customer receives the data and confirms that all useful data was recovered from the cluster, CoresOnDemand engineers re-image the cluster to remove all user data, user accounts and logs. This marks the end of the lease engagement and customers can rest assured that CoresOnDemand is available to help…and it’s pretty fast too.

At the end of the lease customers can download their data or have it shipped on external media. The cluster is later re-imaged and all user data, accounts & logs are also deleted in preparation for the next customer.


ANSYS Launches Free Student Version

This week ANSYS, Inc. made a fantastic announcement that has been in the works for a while, and that we think will greatly benefit the simulation community: a free ANSYS Student product. This is an introductory product focused on students who are learning the fundamentals of simulation and who also want to learn the full power and capability of the ANSYS product suite. It includes ANSYS® Multiphysics™, ANSYS® CFD™, ANSYS® Autodyn®, ANSYS® Workbench™, ANSYS® DesignModeler™ and ANSYS® DesignXplorer™.

Yes, you read that right: all of the flagship products for free. No features or capabilities are turned off. It is the exact same software as the commercial product, but the size of problems that you can solve is limited. It runs on MS Windows. Perfect for students.

PADT is excited about this because it gives students the ability to learn FEA and CFD simulation with the world's most popular and capable simulation tool, without running into brick walls. Want to do a flat plate with a hole in it? No problem. Want to model fluid-solid interaction on a flexible membrane valve? No problem. Want to model explosive forming? No problem. Want to model combustion with complex turbulence? No problem.

All in the same interface that students will use when they enter the workforce or do research at a university.

This is great news and we can’t wait to see what schools and students do with this access.

How to Get It – The New Academic Web Pages

The previous Student Portal is being replaced with an Academic Web area on the ansys.com site: ansys.com/academic.

Go to the ANSYS Student site to learn more about ANSYS Student and how to download your copy. These same pages will have resources to help you learn and understand the product.

The “Pictures”

Let me state categorically that PADT was not consulted on the image that ANSYS, Inc. used for the “student” user that was so happy to find out that there is now a free version of the ANSYS software suite.  Here is their picture:

We would have preferred something like this:


 

Just kidding. We were happy to see this product come out and thought the picture was hilarious. In all seriousness, we will also plug the recent #ilooklikeanengineer Twitter hashtag highlighting the diversity of female engineers. That was awesome, and we would love to see more chances for engineers to show their true selves.

 

Tech Tips and Videos for Electromechanical Simulation with ANSYS Products

We just received a new tech tip bundle from ANSYS, Inc. on Electromechanical Simulation. You may remember when we published the Mechanical and Fluids ANSYS tech tips a few weeks ago. This latest kit continues with information for people making devices and systems that combine mechanical and electrical subsystems. The focus of the kit is the application of ANSYS Maxwell and Simplorer: Maxwell to model low-frequency electromagnetics and Simplorer to model systems.

You can download “The Electromechanical Simulation Productivity Kit” here. The kit includes:

  • ANSYS Maxwell Automation and Customization Application Brief
  • ANSYS Maxwell Magnetic Field Formulation Application Brief
  • Electric Machine Design Methodology Whitepaper
  • Electromagnetics And Thermal Multiphysics Analysis Webinar
  • Rechargeable Lithium Ion Battery Whitepaper
  • Robust Electric Machine Design – ANSYS Advantage Article

We also have a collection of videos that are a great introduction to the tool set and how to use it. Check out the overview and the video on the washing machine at a minimum. Even if you only have a simple EMAG problem or do hand calcs, you need to look at Maxwell and Simplorer.

Seminar Info: Designing and Simulating Products for 3D Printing

Note: We have scheduled an encore Lunch & Learn and companion Webinar for March 23, 2015.  Please register here to attend in person at CEI in Phoenix or here to attend via the web.

People are interested in how to better design and simulate products they manufacture using 3D printing. When the AZ Tech Council let us know they had a cancelation for their monthly manufacturing Lunch and Learn, we figured why not do something on this topic, a few people might show up. We had over 105 people register, so we had to close registration. In the end around 95 people made it to the seminar, which is more than expected, so we had to add chairs. Who would have thought that many people would come for such a nerdy topic?

For an hour and fifteen minutes they sat and listened to us talk about the ins and outs of using this growing technology to make end-use parts. Here is a copy of the PowerPoint as a PDF.

We did add one bullet item in the design suggestions area based on a question. Someone pointed out that the machine instructions, what the AM machine uses to make the parts, should be a controlled document. They are exactly right and that is a very important process that needs to be put in place to get traceability and repeatability.  

Here are some useful links:

As always, do not hesitate to contact us for more information or with any questions.

If you missed this presentation, don't worry, we are looking to schedule a live/web version of this talk with some enhancements sometime in March.  Watch the usual channels for time, place, and registration information. We will also be publishing detailed blog posts on many of the topics covered today, diving deeper into areas of interest.

Thank you to the AZ Tech Council, ASU SkySong, and everyone that attended for making this our best attended non-web seminar ever.

Design and Simulation for 3D Printing Full House

Deflategate Update: ANSYS Simulation Shows it Really Does not Make a Difference.

There is still more debate going on about the deflated footballs that the New England Patriots used in their playoff game. "Who Deflated Them? When? Were they acting on orders?"  But no one is asking if it makes a real difference.

Enter ANSYS simulation software. Using the newest ANSYS product, ANSYS AIM, the engineers at ANSYS, Inc. were able to simulate the effect of lower pressure on grip. It turns out that the difference in pressure only made a 5 mm difference in grip. No big deal.

Being a multiphysics tool, they were also able to quickly run a flow analysis and see what impact drag from “wobble” had on a pass. A 10% off-axis wobble resulted in 20% more drag, which works out to a few yards on a long pass. Their conclusion: throwing a tight spiral is more important than the pressure of the ball.

Check out the full article on the ANSYS blog: 

http://www.ansys-blog.com/superbowl-deflategate-scandal-debunked-using-engineering-simulation/#more-11576

Here is the video as well:

FDA Opening to Simulation Supported Verification and Validation for Medical Devices

Bringing new medical device products to market requires verification and validation (V&V) of the product’s safety and efficacy. V&V is required by the FDA as part of their submission/approval process. The overall product development process is illustrated in the chart below, and phases 4 and 5 show where verification is used to prove the device meets the design inputs (requirements) and where validation is used to prove the device’s efficacy. Historically, the V&V processes have required extensive and expensive testing. However, the FDA’s Center for Devices and Radiological Health (CDRH) has recently issued a guidance document that helps companies use computational modeling (e.g., FEA and CFD) to support the medical device submission/approval process.

Phases and Controls of Medical Device Development Process, Including Verification and Validation
The document, “Reporting of Computational Modeling Studies in Medical Device Submissions,” is a draft guidance document that was issued on January 17, 2014. The guidance specifically addresses the use of computation in the following areas for verification and/or validation:

  1. Computational Fluid Dynamics and Mass Transport
  2. Computational Solid Mechanics
  3. Computational Electromagnetics and Optics
  4. Computational Ultrasound
  5. Computational Heat Transfer

The guidance specifically outlines what form reports need to take if a device developer is going to use simulation for V&V.  By following the guidance, a device sponsor can be assured that all the information required by the FDA is included. The FDA can also work with a consistent set of input from various applicants. 

CFD Simulation of a Drug Delivery System. Used to Verify Uniform Distribution of Drug

Computational Modeling & Simulation, or what we usually call simulation, has always been an ideal tool for reducing the cost of V&V by allowing virtual testing on the computer before physical testing. This reduces the number of iterations of physical testing and avoids the discovery of design problems during testing, which comes late in the development process, when making changes is the most expensive. But in the past, you still had to conduct the physical testing. With these new guidelines, you may now be able to submit simulation results to reduce the amount of required testing.
Simulation to Identify Stresses and Loads on Critical Components While Manipulating a Surgical Device

Validation and verification using simulation has been part of the product development process in the aerospace industry for decades and has been very successful in increasing product performance and safety while reducing development costs.  It has proven to be a very effective tool, when applied properly.  Just as with physical testing, it is important that the virtual test be designed to verify and validate specific items in the design, and that the simulation makes the right assumptions and that the results are meaningful and accurate.

PADT is somewhat unique because we have broad experience with product development, various types of computational modeling and simulation, and the process of submission/approval with the FDA. In addition, we are ISO 13485 certified. We can provide the testing that is needed for the V&V process and employ simulation to accelerate and support that testing to help our medical device customers get their products to market faster and with less testing cost.  We can also work with customers to help them understand the proper application of simulation in their product development process while operating within their quality system.

Questions Decision Makers Should Ask About Computer Simulations

‘TRUST BUT VERIFY’

A guest posting from Jack Thornton, MINDFEED Marcomm, Santa Fe, NM

The computerization of engineering (and everything else) has imposed new burdens on managers and executives who must make critical decisions. Where once they struggled with too little information, they now struggle with too much. Until roughly three decades ago, time and money were plowed into searching for more and better information. Today, time and money are plowed into making sense of myriad computer simulations.

For all but the best-organized decision makers, these opposite situations have proven equally frustrating. For nearly all of engineering history, critical decisions were based on a few pieces of seemingly credible data, a handful of measurements, and hand-drawn sketches a la Leonardo DaVinci—leavened with hands-on experience and large dollops of intuition.

Computer simulations are now everywhere in engineering. They have greatly speeded up searches for information, as well as creating it in the first place, and endlessly multiplying it. What has been lost are transparency and traceability—what was done when, by whom and why. Since transparency and traceability are vital to making sound engineering decisions in today’s intensely collaborative technical environments, decision makers and managers say this loss is a big one.

This is not some arcane, hidden war waged by experts, geeks and professors. This is about designing machinery, components, physical systems and assemblies that are globally competitive—and turn a profit doing so. The complexity of modern components, assemblies and systems has been exhaustively and repeatedly described.

Nor is this something engineers and first-line managers can afford to ignore. Given the shortages of engineering talent, relatively inexperienced engineers are constantly being handed responsibility for making key decisions.

Users of computerized simulation systems continually seek ways to answer the inevitable question, “How do we know this or that or whatever to be true?” Several expert users of finite element analysis (FEA), the basic computational toolset of engineering simulation and analysis, were interviewed for this article. Each interviewee is a licensed professional engineer (PE) and each has been recommended by a leading FEA software vendor.

For decision makers, a simulation, FEA or otherwise, really presents only three options:

  • Signing off on the production of a component or assembly. If it proves to be flawed, warranty claims, recalls, and perhaps much worse may result.
  • Shelving a promising new product, perhaps at the behest of fretful engineers. The investment is written off or expensed as R&D. The marketplace opportunity (and its revenue) may be lost forever.
  • Remanding the project to the analysts even while knowing that “paralysis by analysis” will push development costs too high or cause too big a delay in getting to market.

Since executives and other upper-echelon corporate decision makers rarely possess much understanding of FEA, let alone have time to develop it, a “trust but verify” strategy is the only reasonable approach.

The verify part is easy. FEA modelers and solvers have been well wrung-out over the past 10 to 20 years. All of the FEA software vendors will share details of their in-house tests of their commercial code, the experiences of customers doing similar work, and investigations by reviewers who are often on engineering-school faculties. The same is true for industry-specific “home grown” code.

It’s the trust part that’s so challenging, as in FEA trust depends on understanding some very complicated matters.

Analysis experts note that unless the builders of FEA models are questioned, they rarely spell out the model’s underlying assumptions. Even less frequently (and clearly) described is the reasoning behind the dozens or hundreds of choices they made that are dictated by those assumptions.

And worse, these choices are not always clarified when model builders do provide this detail—quite the opposite, in fact. When pressed for explanations, model builders may simply present the mathematical formulas they use to characterize the physics of their work.

Analysis experts are quick to point out that these equations often confuse and intimidate. Decision makers should insist on commonsense explanations and not equations. And every FEA model builder will try earnestly to explain (often at great length) the model’s implications to anyone who takes the time to look.

In the context of FEA and other simulations, “physics” means the real-world forces to be withstood by a printed circuit board, a pump, an engine mount, a turbine, an aircraft wing or engine nacelle, the energy-absorbing structure of a car, or anything else that is mechanically complex and highly stressed.

This is why transparency and traceability are so important in FEA. Analysts note that some of this is codified in the guidelines for simulation and computational analysis in the ASME / ANSI verification and validation standards. Further support comes from company best practices developed by FEA users and managers, although enforcement is rarely consistent, and voluntary industry standards whose applicability varies widely.

The transparency and traceability challenge is that building a model—again, a subset of the real world—requires dozens of assumptions about the mechanical capabilities that the object or assembly must have to meet its requirements. After these basic assumptions have been coded into the model, hundreds of follow-on choices are needed to represent the physical phenomena in the model.

Analysts urge decision makers to question the stated values and ranges of any of the model’s parameters—and in particular values and ranges that have been estimated. Decision makers are routinely urged to probe whether these parameters’ values are statistically significant, and whether those values are even needed in the model.

A survey of experts turns up numerous aspects of FEA and other computerized simulations that decision makers should probe as part of a trust-but-verify approach. Among many examples:

  • Incoming geometry—usually from solid modeling systems used by product designers— and the topologies and boundaries they have chosen.
  • The numerical values representing physical properties such as yield strengths of the chosen materials.
  • Mechanical components and assemblies. How accurately represented are the bolts and welds that hold the assemblies together?
  • The stiffness of structures.
  • The number of load steps. Is the range broad enough? Are there enough intermediate steps so nothing will be missed? How true-to-life are the load vectors?
  • The accuracy of modal analyses. Resonating harmonic frequencies—vibration—can shake things apart and lead to catastrophic failures.
  • Boundary conditions, or where the object being modeled meets “the rest of the world” in the analysis. Are the specifics of the object’s physical and mechanical requirements—the geometry—accurately represented and, again, how do we know?
  • Types of analysis, which range from small, simple linear static to large, highly complex nonlinear dynamic. Should a smaller simpler analysis have been used? Could physical measurements suffice instead of analyses?
  • In fluid dynamics, how well characterized are the flows, volumes, and turbulence? How do we know? In fluid dynamics, representations of flows, volumes, and turbulence are the numerical counterparts of the finite elements used in analyses of solids.
  • Post-processing the results, i.e., making the numerical outputs, the results of the analysis, comprehensible to non-experts.

Underlying all these are the geometric and analytical components that are found in all simulations. In FEA, this means the mesh of elements that embodies the physics of the component or assembly being modeled. Decision makers should always question the choice of elements as there are hundreds to pick from.

Some models use only a handful of elements while a few use tens of millions. Also to be questioned is the sensitivity of those elements to the forces, or loads, that push or pull on the model. A caveat: this gets deeply into the inner workings of FEA, e.g., explanations of the points or nodes where adjacent elements connect, the tallies of degrees of freedom (DOFs) carried at each node, and the huge number of simultaneous equations that must be solved.
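
For a back-of-envelope feel for why those tallies matter, the sketch below counts the equations a solver has to handle for a mesh of a given size; three translational DOFs per node is typical for solid elements and six for shells and beams, and the node counts here are made up purely for illustration.

```python
# Back-of-envelope model size: equations to solve ~= nodes x DOFs per node.
# Node counts are made up; 3 DOFs/node is typical for solid elements,
# 6 DOFs/node for shells and beams.
for label, nodes, dof_per_node in [
    ("small bracket, solid elements", 20_000, 3),
    ("full assembly, shell elements", 2_000_000, 6),
]:
    equations = nodes * dof_per_node
    print(f"{label}: {nodes:,} nodes -> about {equations:,} equations")
```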

The trust-but-verify approach is valuable in all of the engineering disciplines: mechanical, structural, electrical/electronic, nuclear, fluid dynamics, heat transfer, aerodynamics, and noise/vibration/harshness, as well as for sensors, controls, systems, and any embedded software.

Developers of FEA and other simulation systems are working hard to simplify finding these answers or at least make trust-but-verify determinations less taxing. See Sidebar, “Software Vendors Tackle Transparency and Traceability in FEA.”

Proven approaches

A proven approach to understanding FEA models is offered by Eric Miller, co-owner of Phoenix Analysis & Design Technologies, or PADT, in Tempe, Ariz. “A decision maker with some understanding of the management of the data in an FEA analysis will ask about how specific inputs affect the results. Such a decision maker will lead the model builder and analyst to think more deeply about those inputs. Ultimately a more accurate simulation will be created.”

Miller offers a caveat: “This questioning should be approached as an additional set of eyes looking at the problem from the outside to determine the accuracy of results. The key is to not become adversarial and question the integrity or knowledge of the analyst.”

Jeffrey Crompton, principal of AltaSim Technologies, Columbus, Ohio, goes straight to the heart of the matter: “Let’s start out with the truth – all models are wrong until proven otherwise. Despite all the best attempts of engineers, scientists and computer code developers,” he explained, “a computational model does not give the right answer until you can categorically demonstrate its agreement with reality.”

“Categorically” is a high standard, a term with almost no wiggle room. Unfortunately, given the complexity of simulations, agreement with reality is often not easy to demonstrate. Hence the probing and questioning recommended by FEA experts and engineers.

Secondly, despite tsunamis of data cascading from one engineering department to another, a great deal of the physical world still remains imprecisely quantified. Demonstrating agreement with reality “becomes increasingly difficult,” Crompton added, “when you may not know the value of some parameters, or lack real-world measurements to compare against, or are uncertain exactly how to set up the physics of the problem.”

The challenge for decision makers uncomfortable with the results of FEA analyses is neatly summed up by Gene Mannella, vice president and FEA expert at GB Tubulars Inc. in Houston. “Without a basic understanding of what FEA is, what it can and cannot do, and how to interpret its results, one can easily make bad and costly decisions,” he points out. “FEA results are at best indicators. They were never intended to be accepted” at face value.

As Mannella, Crompton and other FEA consultants regularly remind their clients, an analysis is an approximation. It is an abstraction, a forecast, a prediction. There will always be some margin of error, some irreducible risk. This is the unsettling truth behind the gibe that “all models are bad but some are useful.” No FEA model or analysis can ever be treated as “gospel.” And this is why analysts strive ceaselessly to minimize margins of error, to make sure that every remaining risk is pointed out, and to clearly explain the ramifications.

“To be understood, FEA results must be supplemented by the professional judgment of qualified personnel,” Mannella added. His point is that decision makers relying on the results of FEA analyses should never forget that what they “see” on a computer monitor, no matter how visually impressive, is an abstraction of reality. Every analysis is a small subset of one small part of the real world, and it is constrained by deadlines, budgets, and the boundaries of human comprehension.

Mannella’s work differs from that of most other FEA shops: it is highly specialized. GB Tubulars makes connectors for drilling and producing oil and gas in extreme environments. Its products go into oil and gas projects several miles underground and also often beneath a mile or more of seawater. Pressures are extreme, bordering on the incalculable. The risk of a blowout with massive damage to equipment and the environment is ever-present.

The analysts also stressed probing the correlation with the results of physical experiments. Tests in properly equipped laboratories by qualified experimentalists are the single best way to ensure that the model actually does reflect physical reality. Which brings us to the FEA challenge of extrapolations.

Often the most relevant test data is not available because physical testing is slow and costly. The absence of relevant data makes it necessary to extrapolate among the results of similar experiments. Extrapolations can have large impacts on models, so they too should be questioned and understood.

To deal with these difficulties, Crompton and the other analysts recommend, first, managing the numbers with statistical process control (SPC) methods and, second, devising the best ways to set up the model and its analyses with design-of-experiments simulations. Both should be reviewed by decision makers—ideally with a qualified engineer looking over their shoulders.

“Our mantra in this situation is ‘start simple and gradually add complexity,’” Crompton said. “Consider starting with a [relatively simple] closed-form analytical solution. The equation’s results will help foster an understanding of how the physics and boundary conditions need to be implemented for your particular problem.” [A closed-form solution is an equation with a single variable, such as stress equals force divided by area, as opposed to a model; even the simplest simulation and analysis models have several variables.]
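
As a trivial example of the kind of closed-form baseline Crompton means, a uniaxial tension check takes one line of arithmetic. The load, cross-section, and FEA value below are made-up numbers; the point is simply that the FEA stress in a uniform section should land close to the hand value before anyone trusts the rest of the model.

```python
# Closed-form sanity check for a bar in uniaxial tension: stress = force / area.
# All numbers are made up for illustration.
force_n = 10_000.0      # applied axial load, N
area_mm2 = 100.0        # cross-sectional area, mm^2

hand_stress_mpa = force_n / area_mm2   # 1 N/mm^2 == 1 MPa
fea_stress_mpa = 103.0                 # hypothetical value read from the FEA model

relative_error = abs(fea_stress_mpa - hand_stress_mpa) / hand_stress_mpa
print(f"hand calc: {hand_stress_mpa:.1f} MPa, FEA: {fea_stress_mpa:.1f} MPa, "
      f"difference: {relative_error:.1%}")
```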

Peter Barrett, principal of CAE Associates in Middlebury, Conn., noted that, “the most experienced analysts start with the simple models that can be compared to a closed-form solution or are models so simple that errors are minimized and can be safely ignored.” He commented that the two acronyms that best apply to FEA are KISS (“Keep It Simple, Stupid”) and “garbage in, garbage out,” or GIGO. In other words, probe for the unneeded complexity and bad data.

Model builders are always advised by FEA experts to start by modeling the simplest example of the problem and then build upward and outward until the model reflects all the relevant physics. Decision makers should determine whether this sensible practice was followed.

When pressed for time, “some analysts will try to skip the simple-example problem and analysis,” Barrett said. “They may claim they don’t have time” for that fundamental step, i.e., the analyst thinks the problem is easily understood. Decision makers should insist that analysts take the extra time. “The analysis always benefits from starting as simply as possible,” he continued. “Decision makers will reap the rewards of more accurate analysis, which is a driver for projects being on time and under budget.”

Ken Perry, principal at Echobio LLC, Bainbridge Island, Wash., concurred. “The first general principle of modeling is KISS. Worried decision makers should verify that KISS was applied from the very beginning,” he said. “KISS is also an optimal tool to pick apart existing models that are inflated and overburdened with unnecessary complexity,” Perry added.

A favorite quote of Perry’s comes from statistician R.W. Hamming: “The purpose of computing is insight, not numbers.” Perry elaborated: “Decision makers should guard against the all-too-human tendency to default for the more complicated explanation when we don’t understand something.  Instead, apply Occam’s razor.  Chop the model down to bite-sized chunks for questioning.” [Occam’s Razor is an axiom of logic that says in cases of uncertainty the best solution is the one requiring the fewest assumptions.]

Questioning is especially important, Perry added, “whenever the decision maker’s probing questions evoke hints of voodoo, magic or engineers shaking their head in vague, fuzzy clouds of deference to increasingly specialized disciplines.”  Each of these is a warning flag that the model or analysis has shortcomings.

Perry works in the tightly regulated field of implantable medical and cardiovascular devices. He has one such device himself, a heart valve, and has pictures to prove it on his Web site. Tellingly, Perry began his career not in FEA but as an experimentalist. He worked with interferometry to physically test advanced metal alloys.

Perry is living proof that FEA experts and experimentalists could understand one another if they tried. But often they don’t try, which is another challenge for decision makers.

The last and most cautionary words are from Barrett at CAE Associates. More than anyone else, he was concerned about the risks of inexperienced engineers making critical decisions. Such responsibility often comes with an unlooked-for promotion to a product manager’s job, for example. Unexpected increases in responsibility also can arrive with attrition, departmental shakeups, and corporate acquisitions and divestitures.

“In our introductory FEA training classes we often have engineers signed up who have no prior experience with FEA. They sign up for the intro class,” he said, “because they are expected to review results of analyses that have been outsourced and/or performed overseas.”

Barrett saw this as “very dangerous. These engineers often do not know what to look for. Without knowing how to check, they may assume that the calculations in the analysis were done correctly.  It is virtually impossible to look at a bunch of PowerPoint images of post-processed analysis results and see if the modeling was done correctly. Yet this is often the case.”

PADT Joins AMIGOS and Arizona Mining Association


After working with the mining industry in Arizona for years, we have finally gotten around to joining the two professional groups that work with the industry: Arizona Mining & Industry Get our Support, better known as AMIGOS, and the Arizona Mining Association.  We are excited about being able to contribute more to the Arizona mining community through these two groups.

Mining is still a dominant industry in the state, especially copper mining, with Arizona providing 68% of the copper produced in the US in 2011. PADT has supplied software and hardware to large multinational mining companies, equipment suppliers, and small consultants. We have also provided simulation of mining processes and airflow in mines to several companies. With the addition of Flownex to our product and services offering, our involvement with the industry has grown even more.

But, to be honest, far and away the coolest part of being involved with mining in all of the states we work in is when we get to go visit a mine, or get to look up close at the huge equipment our customers make. This is some cool engineering. We plan on visiting new mines and exploring more equipment as we get more involved with these groups.

If you are interested in joining either group, click here.

The Reality of Simulation Driven Product Development

A note to our regular readers: This is not a normal Focus post. No info on how to use an obscure new ANSYS command. This may be something our regular readers (the people who do simulation) might find useful to share with their management. And maybe a CEO/CTO/COO or two might stumble across it and “see the light” that we have all been working in for years.

I’ve been involved in planning or attending a couple of what we call “C” level visits in the past month or so. A “C” level visit is where we talk with the CEO, CFO, CTO, COO, or some sort of high level executive at a company.  These visits are very different than sitting in a room with a bunch of engineers showing off what ANSYS software can do, or talking about what services PADT can offer.

In the “C” level visits we are there for two reasons. The first is to understand what the high level product development needs are for the company from a business perspective.  Once we know that, we like to articulate how the products we sell or the services we offer can help the company meet those goals faster and with less effort and cost. And when simulation fits into their needs, we talk about Simulation Driven Product Development (SDPD).

Many people in the simulation software business talk about SDPD a lot. They use SDPD as a buzzword and they surround it with other buzzwords: time to market, rapid product development, stage gates, decision trees, etc. In such a discussion you talk about the vagaries of “enabling your enterprise” and “collaborative global solutions.” All of this is oriented towards a single message: buy our tools.

The Real World

PADT is fortunate enough to not only sell simulation tools, but also to use them as a service to help our customers drive product development. We also use simulation to drive product development that we do here at PADT. (WAH? PADT does product development? Yes we do. And rapid prototyping. Click the links to learn more.)

Top this off with the technical support and mentoring that we offer our simulation customers, and we are able to get a pretty good idea about the reality of SDPD. And that reality is that SDPD really works; it can make a huge difference in many areas. But the reality is also that SDPD needs to be done correctly to be effective.

Why SDPD is Effective

To understand the real-world impact of SDPD you have to step back and look at what developing a product is about. There are a lot of different processes, and people get all “burn the heretic at the stake” over their particular flavor. But they all share some common characteristics:

  1. Define what you want the product to do (specifications)
  2. Come up with and capture all of the things that define the product (design)
  3. See if your ideas work (test)
  4. Fix stuff that didn’t work (iterate)
  5. Make it (manufacture)

Every step in the process involves people asking questions and answering them.  How big, how strong, how long, how much this or that?  And each question can be answered in many different ways. Things like experience, calculations, comparison to existing solutions, statistical studies, testing, and many more.  The cost and correctness of how those questions are answered has a direct impact on the cost and speed of a development project.  Also, many studies have shown that the sooner in the schedule that you answer those questions, the more efficient your project is.

What is great about simulation is that it allows you to answer questions quickly and accurately.  Working in a virtual environment on the computer you can combine comparisons, testing, calculations, and statistics in one place with speed and very little capital investment. The fact that you can do it so fast also allows you to avoid making assumptions and simplifications that reduce the accuracy of the answer.

The most comprehensive study on the effectiveness of simulation for driving product development can be found in “The Impact of Strategic Simulation on Product Profitability” from the Aberdeen Group.  It shows that best-in-class companies across industries are companies that use simulation to drive their product development.

The study finds that:

There is no point in the design process where companies do not profit from intelligent decision-making. By integrating simulation analysis from the earliest stages of design, the Best-in-Class are able to make better decisions through the process. This enables these leaders to drive higher quality and lower cost products, as well as deliver the innovations and features that differentiate their products.

Making SDPD Effective for Your Organization

So companies make more money using simulation to drive their product development.  It would be nice if it was true that all companies that use simulation automatically see a benefit.  But we are talking about the reality of SDPD and that reality is you have to have the proper simulation tools, and you have to use them effectively.

The Right Tools

As far as tools go, you should know where I stand: ANSYS, Inc.'s products. If you are reading this you are probably an ANSYS, Inc. product user or you got this posting from someone who is. Why are these tools the leaders across the industry? Because they have breadth and depth so you are not limited by your simulation tools, they are accurate, and they work together so you do not have to jump through hoops to work as a team. That is really all there is to it.

If you cannot use this tool set for some reason, say your senior manager is married to the competition's local rep (which is maybe one of the few valid reasons), you still need to make sure you stay high end. Do not cheap out on a CAD-based tool or a low-end tool that is “good enough for what we need.” Anything other than a full-function tool suite will limit your ability to get accurate solutions, or to model your product completely. That $20,000 you saved will get eaten up in about a week of fumbling around trying to get useful information.

Yes, these tools cost a lot more than the low-cost or CAD-based alternatives. But there is a reason for that. It is the army of developers, support engineers, and product managers that work day in and day out to improve the speed, accuracy, and capability of their simulation tools. The reality of simulation is that having 80% of the functionality is only good 80% of the time. When you need that extra 20% of functionality, you need it. And when you do not have it, your project bleeds cash.

Effective Application

Deciding to drive your product development with simulation: easy. Deciding on the right tool set: a bit of work, unless you just go with ANSYS products, then it is easy. Now you have to make it work.

This is such a big topic that we did a seminar on it about two years ago.  I’ve uploaded a PDF of the presentation if you would like more details.

The gist of it is the following four rules:

  1. Establish goals for SDPD in general and establish goals for each project that uses simulation.  Without goals it is easy to do too much simulation or to do the wrong simulation.
  2. You must have the right type of users doing the right tasks: experts and mainstream users. Also, do not turn good engineers into bad users by violating the other rules.
  3. Use the right tools. Not just the simulation software, we covered that.  You need the right hardware, the right support, and the right utility software to support your efforts.
  4. Design the right flexible process for your team and constantly improve on it.

Mainstream

I have been driving product development with simulation for over 25 years, and many people who read this blog have been doing it for longer. Once a secret of the aerospace and automotive industries, SDPD is now mainstream. We have customers that use it to design ear buds, mining equipment, coolers for organ transplants, and toys. It is used to make almost every electronic device around us more reliable, cooler, and faster. And we still have people that use it to design turbine engines, spacecraft, and automotive components.

In fact, the industries that are long-time users are increasing their seat counts and the size of their computing systems. Many that we know of are making multi-million dollar investments every year and growing that investment year over year for a simple reason: they see results from driving more and more of their design process with simulation.

If you are not using simulation, or some portion of your company is not using simulation, then something is wrong. You or they are literally leaving money on the table and giving a competitive edge to the competition. If you would like to learn more about how PADT and many of our customers have been successful with simulation, feel free to contact me. Or just get out there and start evangelizing something that has already been proven to work.