I just saw a news article reporting that a consulting company called 01 Consulting has done a study of the CAE market. Unfortunately the report is $2,850, so no matter how interested I am in the results, I’m not going to buy one.
Since PADT has been building the CUBE machines (www.CUBE-HVPC.com), we have developed a need for clear-cut instructions for running parallel benchmarks with ANSYS software tools on the CUBE machines (which come standard with the CentOS Linux 64-bit operating system). The computational fluid dynamics (CFD) solver ANSYS CFX is one such tool. I have summarized instructions here for starting the CFX Solver-Manager on a Linux machine, loading the *.def file, configuring it for a parallel run, and then running it.
Step 1: Place the benchmark input file (*.def file) for CFX in /home/user.
We’ll assume the benchmark file is the standard CFX airfoil benchmark:
*.def file = “perf_Airfoil_R12_PADT.def”.
Step 2: Start CFX on the command line: type “/ansys_inc/v121/CFX/bin/cfx5”.
Figure 1 – Command to launch CFX
Step 3: Set the Working Directory / Start the Solver Manager
a) The CFX Launcher will appear after you enter the command in Step 2.
b) Check that the “Working Directory” corresponds to /home/user (red rectangle in Figure 2). If it is something different, select the folder icon contained in the red rectangle in Figure 2, and navigate to /home/user.
c) Select the “CFX-Solver Manager” (black rectangle in Figure 2)
Figure 2 – Set the Working Directory and open the CFX-Solver Manager
Step 4: Define Run
a) Select “File – Define Run” (Figure 3)
b) Check “Show Advanced Controls” (Figure 3)
Figure 3 – Define the Run
Step 5: Read the *.def file
a) Select the folder icon (Figure 4).
b) Pick “perf_Airfoil_R12_PADT.def” (Figure 4).
c) Select Open (Figure 4).
Figure 4 – Read the *.def file
Step 6: Set the Run Mode
a) Select the pull-down next to “Run Mode” (Figure 5).
b) Pick “HP-MPI Local Parallel” (Figure 5).
Figure 5 – Set the Run Mode
Step 7: Partitioner Settings
a) Go to the “Partitioner” tab (Figure 6)
b) Under “Partitioning Detail” select “Multidomain Option – Coupled Partitioning” (Figure 6)
Figure 6 – Partitioner Settings
Step 8: Add partitions and Run
a) Click the “plus” sign to add partitions (the number of partitions should be the same as the number of processors for the run) (Figure 7).
b) Select “Start Run” (Figure 7).
Now, your job should start up and you will see the residuals/imbalances:
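For scripted benchmarking, the same run can also be launched without the GUI. The command below is a sketch of the batch-mode equivalent of Steps 2–8; check “cfx5solve -help” on your installation to confirm the options your CFX release supports. The paths and file name match the example above; the partition count (8 here) is just an illustration.

```shell
# Batch-mode equivalent of the GUI steps above (a sketch, not verified
# against every CFX release). Run from the working directory that holds
# the *.def file.
cd /home/user
/ansys_inc/v121/CFX/bin/cfx5solve -def perf_Airfoil_R12_PADT.def \
    -par-local -partition 8
```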
For a couple of releases now you have been able to use the very common scripting language Python to script applications that are considered Workbench native: the Project page, Engineering Data, DesignXplorer, and the Parameter Manager. This is pretty cool and something that we will cover in a future The Focus article. But one thing that most people don’t know is that Python also works well with Microsoft Excel. So if it works with Excel and Workbench, you should be able to use it to connect the two.
Now before you get too excited, realize that ANSYS Mechanical, CFX, FLUENT, CFX Post and most other solvers and post processors that launch from Workbench are not Workbench native applications, so they don’t speak Python. So no, this is not a replacement for APDL or CCL. They are called “data integrated” applications in that they share data through the Workbench interface, but they have their own unique software structure and their own scripting languages.
So to take advantage of this Python commonality you have to connect to one of the native applications, and the Parameter Manager makes the most sense. Fortunately, the kind technical people at ANSYS, Inc. have provided a great example in the help system to do just that, and in doing so they have cut the length of this article in half.
ANSYS Documentation Example
You can find the example under: Workbench // Scripting Guide // Using Scripting in ANSYS Workbench // Usage Examples // Updating a Workbench Project and Parameters from Excel. It is a very simple example, a cantilever beam, but it shows the basics and is a great place to start.
Read the manual page completely, especially the notes at the bottom; they point out the key things you need to know. The files you need are in the documentation, but you have to dig up the Excel/Workbench files yourself, copy the Python script, and strip out the line numbers. I’ve put them all together in a zip file to make things easier:
There are a few things worth discussing on this example before we move on to something more complicated.
How to Run It
One thing the example leaves out is the details on how to run it. To start, you need to copy all the files to one directory: the Python script, the Workbench database, the Workbench files, and the Excel file. Then (the example does point this out) you need to set the working directory variable in the script to the directory with all the files. Now it is time to run. To do this, open up Workbench with a new project (or File->New if you already have Workbench open). Now run the script with File->Scripting->Run Script File…
This will launch Excel and Workbench and add the command to execute the update script when you push the button. So, what next? Change the value for Bar Length or Load and press the button. This should cause Workbench to update and the Maximum Deformation value to change to Updating… and then the new value when the run is done. And that, in a nutshell, is the basics of making the two tools talk.
How it Works
The documentation does a pretty good job of explaining the details. The important take away is that you have to do a couple of key steps in any program that is going to do this sort of thing:
Tell Python to load the Excel libraries
Set up the working directory; the routines all need absolute paths to work
Grab cells from Excel, grab parameters from Workbench, and set one to the other as needed
Do an update
Provide a mechanism to start the process (the button in this case)
If you have these steps, you can make Excel and ANSYS talk.
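To make the flow of those steps concrete, here is a plain-Python mock of the data movement only. The real script runs in Workbench’s IronPython interpreter and talks to Excel over COM; the dictionaries, cell addresses, parameter names, and the fake “solve” below are all stand-ins invented for illustration.

```python
# Illustrative mock of the Excel <-> Workbench data flow -- NOT the real
# API. "excel_cells" stands in for the spreadsheet, "wb_params" for the
# Workbench Parameter Manager. Names and numbers are made up.

excel_cells = {"B2": 100.0, "B3": 250.0}   # inputs typed into the sheet
wb_params = {"P1": None, "P2": None,       # Workbench input parameters
             "P3": None}                   # Workbench output parameter

def update(cells, params):
    # Grab cells from Excel and set the Workbench parameters
    params["P1"] = cells["B2"]
    params["P2"] = cells["B3"]
    # Do an update (here, a fake solve standing in for the real model)
    params["P3"] = params["P1"] * params["P2"] * 1.0e-4
    # Push the output parameter back to a spreadsheet cell
    cells["B5"] = params["P3"]

update(excel_cells, wb_params)
print(excel_cells["B5"])
```

In the real script, the button (step 5) is what triggers the equivalent of the `update` call.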
Where Things Run
An important thing to get your head around is where the script is running, or more accurately, what process is reading and executing the script. The Python interpreter is actually in the Workbench environment. When you run the Python script you are not running it in Excel or in the operating system. It is running through an interpreter inside Workbench.
This is important because it controls what information is automatically available to the script as well as how you launch it. In this example, you launch from Workbench and it starts up Excel, does stuff in Excel, and then assigns a python routine to the button. When the button is clicked, you are still running a python script in Workbench – it is just talking to Excel.
Limitations in the Example
There are two big limitations with this example when it comes to using it on a real problem. First, it is not general enough: you have to hard-code file names, directory locations, cell locations, and strings. This is not a huge deal, because if you are going to use an Excel-based simulation tool over and over, you only have to set it up once.
The other problem is that the action to update the spreadsheet and model is assigned as an OLEObject to the button. Unfortunately, that assignment is only active while the script is running. Once the script finishes, the click action is removed and the button becomes useless, allowing only a single use. This is a significant issue.
A More General and Complex (Sort Of) Example
A common potential usage for this sort of tool would be to provide a user with an Excel based tool that they would use to design a part and see the results. For this example we are going to use a very complicated tool for designing towers with a rectangular cross section. The input variables are the length, width, and height of the tower as well as a pressure applied to one face. The output variables are the deflection from the pressure and the first 10 natural frequencies.
Of course a real example would have more complex input and output values, and the spreadsheet would probably do a lot more calculations, perhaps even optimization.
This example avoids the button problem by actually running Workbench from Excel by launching it from the command console and supplying the project file AND the python script as startup parameters to Workbench. Workbench loads the project, runs the script, then exits.
The spreadsheet is called TowerTool1 and it is a macro (xlsm) spreadsheet. You can download it here:
The sheet is not that different from the ANSYS-provided example. It has cells for input and output parameters, and it also has a button to update the system. The first thing that is different is that it uses named ranges for the input and output cells. To see the names, go to Formulas->Defined Names->Name Manager. I used “Create from Selection” to automatically assign the text in Column B to the values in Columns C and D for each row.
Also note the units. Because Workbench parameters (with the exception of Design Modeler parameters) are unit sensitive, you can specify the units in those columns and use them to do unit conversion as needed. In this example only the units on the Pressure are implemented to show how this is done.
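A minimal sketch of that unit handling looks like the following. The helper function and conversion table are hypothetical (they are not from the example spreadsheet), but they mirror the idea: the named range gives you a value plus a unit string, and you convert to the unit the Workbench parameter expects before assigning it.

```python
# Hypothetical unit-conversion helper mirroring the spreadsheet's
# pressure columns: the cell holds a value and a unit string, and we
# convert to pascals before setting the Workbench parameter.

PRESSURE_TO_PA = {"Pa": 1.0, "kPa": 1.0e3, "MPa": 1.0e6, "psi": 6894.757}

def pressure_to_pa(value, unit):
    """Convert a pressure read from a named range to pascals."""
    return value * PRESSURE_TO_PA[unit]

print(pressure_to_pa(2.0, "MPa"))   # value/unit pair as read from the sheet
```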
There is also a little Drop Down Yes/No choice for the user to decide if they want to save the project file or not when they are done running.
The button is also different from the ANSYS example. Instead of being assigned an action by the python script, it executes a very simple Visual Basic routine:
Private Sub CommandButton1_Click()
    retval = Shell("C:\Program Files\Ansys Inc\V130\Framework\bin\win64\runwb2 -X -R updateWB2.py -F ExcelTower1.wbpj", vbNormalFocus)
End Sub
The line uses the Shell function to execute a system level command in Windows. The command it runs is the command line way to start Workbench.
The -X option says to run interactively and exit when the given command line operations are completed.
The -R option gives a Python script to be run, and this is our script that is presented in detail below.
The -F option specifies the project file to be opened. As an aside, you could make this program more general by having a cell in the spreadsheet hold the file name and then substituting that value into the execution string.
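For illustration, here is a sketch of assembling the same command string in Python instead of VBA, with the project file name pulled out into a variable as the aside suggests. The install path is the default from the article and may differ on your machine.

```python
# Sketch: build the same command line the VBA Shell call issues.
# The project file name is a variable, so it could be read from a
# spreadsheet cell to make the tool more general.

runwb2 = r"C:\Program Files\Ansys Inc\V130\Framework\bin\win64\runwb2"
script = "updateWB2.py"
project = "ExcelTower1.wbpj"   # could instead come from a named range

cmd = '"{0}" -X -R {1} -F {2}'.format(runwb2, script, project)
print(cmd)
# To actually launch Workbench from Python you could hand cmd to the
# subprocess module instead of VBA's Shell function.
```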
The ANSYS Model
This extremely complex model can be found in the following archived project file:
The model contains a DesignModeler solid, a static structural solution, and a modal analysis. The key aspect of the model is the parameters.
The outline for the parameter manager looks like this:
The various parameters were set up in each system in the normal way, by clicking on the parameter box next to them for each value you want to supply or return. The key thing to note here are the Names (P1-P15) for the parameters, because this is what our python script will use to change or retrieve values before and after the model update.
The actual script we are using is shown below. Comments have been added to describe important steps. You can download it, without the line numbers, here:
Please don’t laugh too hard. I ran out of time to convert the code into something more concise using loops and variables. The whole thing could be made much more general if you know Python better than I do.
Execution is pretty simple. When the user pushes the Update system button the program goes off and launches Workbench with the script as input. The script changes the parameters, updates the model, saves if asked for, updates the values in the spreadsheet, then Workbench exits.
One key thing to know is the issue with working directories. The Excel file needs to be in the same directory as the script and the project. Sometimes Excel changes the working directory (say you opened a different spreadsheet). If you get an error on execution that Workbench can’t find the project file, you just need to go back to Excel and do a Save As of your file on top of itself. That fixes the problem. More advanced users can use VBA to control directories.
The first resource for doing this type of work is the ANSYS Workbench->Scripting Guide in the online help.
ANSYS uses a flavor of Python called IronPython, which has a great web site at:
Both examples presented are pretty simple, but they have the core of what you need to drive Workbench from Excel. It really is not that hard to do and can provide a nice interface for users who don’t know Workbench.
Some areas for improvement and suggestions for a real application are:
Use JScript in Workbench to add a button that runs a script that attaches to an open Excel file and updates based on the current values in that file. So instead of running Workbench in batch from Excel, pressing the button would update everything. That seems to be a more effective way to do it. (Maybe a quick article in the future, if Matt helps me figure out how to add a button quickly…)
Do more with VBA or Python in the Excel file to make your Excel-based tool more sophisticated.
There is no error checking in this example… very bad. For a real-world tool, a lot more code should be added to check inputs and to capture errors from the various functions.
As one can imagine, this can be extended to other programs. A good example would be an optimization tool or another “calculating” tool like MathCAD or Matlab.
Twice in the last week we’ve taken tech support calls in which the users questioned why their stress or strain results were being reported differently in Workbench Mechanical vs. the results from the same results file in /POST1 with ANSYS Mechanical APDL. After answering those questions it was pretty obvious that a Focus blog entry was in order. All we needed was a good, relevant example to demonstrate the issues and the explanation.
First glimpse of approaching dust storm. All photos by the author.
In case you missed it, the big story here in the Phoenix area this week was our monster haboob, or dust storm. If you’re not familiar with the term haboob, Wikipedia explains it here: http://en.wikipedia.org/wiki/Haboob. In order to have a humongous dust storm, you’ve got to have wind.
Stopped in a parking lot to take these pictures – the only camera I had was my phone.
About 3 minutes after this picture was taken the dust storm arrived.
Wind tends to cause damage, but although our recent dust storm is estimated to have been 100 miles wide and up to 10,000 ft. high, we fortunately did not sustain much significant wind damage. Things that do tend to get mangled, however, are deployed patio umbrellas and portable expanding sunshades. These sunshades typically retract into a compact size and fit in a zip up carrying case. Many of us have collections of damaged sun shades that still work via creative application of duct tape, wire, etc. These inexpensive shades work great for keeping the sun off of us during birthday parties or other outdoor gatherings, but high winds tend to cause the metal members to bend and break, causing the shades to need some field engineering repair if not just a one way trip to the dumpster.
Here is a solid geometry representation of a typical portion of the frame of a representative shade. It consists of two rectangular hollow members, pinned to each other at the center, with pins at each end that in the full structure would be attached to additional components.
For simplicity, we fixed the pins on the right side to ground, while those on the left side have an upward bearing load applied to the upper pin and a downward bearing load applied to the lower pin. These loads tend to cause the members to bend at the central pin. The bearing loads in our example represent the effect of a strong gust of wind hitting the fabric canopy above the frame, with the load eventually reacting through the frame to stakes that attach the frame to the ground at the bottom. The main thing to note here is that the applied load is large enough to cause significant plastic deformation, not unlike what one might experience in the real world when one of these structures is subjected to a very strong wind.
Workbench Mechanical, Coarse Mesh, Peak von Mises Stress is 79,219 psi
Same Results File in Mechanical APDL /POST1, Peak von Mises Stress is 83,873 psi
The issue here is that for our initial run with a very coarse mesh, when we view the von Mises stress results in the Mechanical editor and then compare them with the von Mises stress results obtained from the same results file in /POST1 in Mechanical APDL, we notice there is a difference (79.2 ksi vs. 83.9 ksi). Why is that? It has to do with how stresses are calculated. First let’s consider Mechanical APDL and /POST1. The original graphics display system is known as Full Graphics. Fifteen or twenty years ago ANSYS, Inc. developed a newer graphics display system for MAPDL known as PowerGraphics. There are several differences between these two display systems which affect results quantities.
ANSYS Mechanical APDL uses PowerGraphics by default, which among other things only looks at results on the exterior surfaces of the model. Full Graphics, on the other hand, includes interior elements in addition to the exterior surfaces when displaying results plots. Another difference is that with PowerGraphics we can vary the number of element facets displayed per element edge with the /efacet command. The default is one facet per edge, but for midside-noded elements we can increase that to two or four. With Full Graphics we are stuck with one facet per element edge. Workbench Mechanical uses an algorithm whose results tend to compare more favorably with Full Graphics, although it apparently displays with two facets per element edge. Another option in MAPDL is to plot nodal (averaged) vs. element (unaveraged) stresses.
So, which of all these methods is the correct one? I would consider them all to be correct, just different. However, we can use the difference in results as a guideline for our mesh density (as well as for the presence of singularities). If there is a significant difference between PowerGraphics and Full Graphics results in MAPDL, this usually indicates the mesh is too coarse, at least in our region of interest. As the mesh is refined, the difference between the two calculations should decrease. In Workbench Mechanical 13.0, we can plot averaged and unaveraged stress and strain results; the choice is made in the details view for a given plot. A significant difference between these two quantities also indicates that mesh refinement is needed.
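A toy calculation makes the averaged-vs-unaveraged distinction concrete. The numbers below are made up: on a coarse mesh, two elements sharing a node each report their own stress there, so the unaveraged (element) plot shows the extreme value while the averaged (nodal) plot shows the mean.

```python
# Toy illustration (made-up numbers): two elements share a node and,
# because the mesh is coarse, each reports a different stress there.

elem_nodal_stress = {   # element id -> stress it reports at the shared node
    1: 79000.0,
    2: 84000.0,
}

# Unaveraged (element) plot shows each element's own value; its peak:
unaveraged_max = max(elem_nodal_stress.values())

# Averaged (nodal) plot shows the mean of the contributing elements:
averaged = sum(elem_nodal_stress.values()) / len(elem_nodal_stress)

print(unaveraged_max, averaged)
# As the mesh is refined, adjacent elements agree more closely at shared
# nodes, so these two numbers converge toward each other.
```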
In our shade frame model, we can see that as the mesh is refined, the difference in von Mises stress results decreases, as shown in the table below.
A similar effect is seen with the von Mises plastic strain results:
Regarding the mesh densities used, the coarse mesh had an element size of 0.05 in. on the member hole at the high stress/strain location, while the fine mesh had an element size of 0.025 in. on the same hole. Another way to look at the mesh refinement is that the coarse model had 20 elements on the hole of interest while the fine mesh had 104 elements on the same hole. Clearly the coarse mesh in this example was way too coarse for engineering purposes, but it was selected for this article to ensure the effect of the different results calculation methods was significant and observable.
So, the bottom line here is that if you see unacceptable differences in stress or strain results using different results calculation methods, it likely means that your mesh, at least in the area of interest, is too coarse. Try adding mesh refinement and check the results again. In Mechanical, you can try adding a Convergence item to a scoped result plot to at least partially automate this process. Just be careful that you don’t include any singularities in your desired convergence region.
If you were expecting a reference to the Kansas song, “Dust in the Wind,” well, I guess this is it. Fortunately we don’t seem to have many lingering effects of the big dust storm. The parking lot here at PADT has a thin layer of dirt that’s gradually disappearing. Once we get a good rain it will all get washed away.
I first started contemplating this article several weeks ago, and I was planning on somehow working in a Jersey Shore reference. But now that I’ve relocated to Colorado and am recovering from a climbing trip that was a bit above my ability, my creative juices are a little low (they were used up trying to improvise my way up an overhanging roof pitch).
Hmm…I’m noticing there aren’t a lot of foot-holds to get me over there; prusik to the rescue!
Anyways, User Defined results were first introduced in R12 to grant the user access to element table items.
When you insert a user-defined result, you are required to fill in the ‘expression’ line in the details window.
Just like Johnny 5, ANSYS needs input
You can look up everything that’s available through the documentation, but we all know that no one reads the instructions.
Reading is boring…I’ll wait until the movie comes out, I hear it stars the guy who voiced Johnny 5
There’s a much easier way to auto-populate the ‘expression’ line…the Worksheet view! To access this, first click on the ‘Solution’ branch and then select ‘Worksheet’ (tab in R12, button in R12.1 and newer). This will show you a list of all the user defined result expressions. Find one you like, right-mouse-click on it and select ‘Create User Defined Result’.
That’s more like it
This will insert a ‘User Defined Result’, the only work you need to do is scope it to a body (if necessary). The only ‘tricky’ part of this process is that you need to solve the model first before using the Worksheet view. This is because before the model has been solved, Mechanical doesn’t know what is in the result file. So if the worksheet view is blank or grayed out, it’s because you haven’t solved the model.
So what’s the benefit of using the User Defined Results? Say you wanted to look at total strain, kinetic energy, or reaction force contour plots…just to name a few. In order to view any of those, you would either have to open the .rst file in MAPDL or use the User Defined Result.
Left = FX, Right = Total Strain
Here’s a quick description of the ‘headers’ available on the Worksheet tab.
Element Nodal Reaction Forces
Nodal Orientation Values
There are more headers that are listed in the documentation (I know, we all agreed that was boring). However if you’re looking for items stored in the NMISC or SMISC (for ‘regular’ or contact elements), those are accessible provided you properly format the expression line.
So now let’s go through an example where we actually use this functionality. A customer called in asking how to calculate the volume of a part above a specified stress level. Interesting question…
First we create a user defined result and use the ‘VOLUME’ expression. So we’re half-way there. Next, we need to understand about how the results are stored for a result item in the ‘Solution’ branch. Each contour plot is actually a vector in the form of element/node vs result. You can see this for yourself by right-mouse-clicking on the item in the tree and selecting ‘export’. So right now we have a vector defined of element vs volume. Now we just need another listing of element vs stress.
When you create a stress result in Mechanical, the default behavior is to show the nodal stress, which represents the average of the adjacent elements. That’s not what we want. If we look into the ‘Integration Point Results’ we see there are more options. Elemental Mean seems like it might work for us, but when we export it, we see that it’s still reporting stresses at the nodes, only now using a different integration scheme.
In order to access the average stress value of the element (not node), we need to use the User Defined Results. We’ll ask to evaluate the expression seqv (von Mises) and set the integration option of using the elemental mean.
Now when we export that result item we get what we want…element vs stress:
Now we just need to export both vectors (volume and stress), then copy/paste/sort/sum and you’re done. Don’t forget the most important step…billing for 4 hours of post-processing work.
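The copy/paste/sort/sum step can also be scripted. Here is a sketch of that post-processing on the two exported vectors, with made-up element IDs, volumes, and stresses standing in for the real export files: sum the volume of every element whose elemental-mean von Mises stress exceeds the specified level.

```python
# Post-processing sketch for the customer question above: given the two
# exported vectors (element vs volume, element vs elemental-mean stress),
# total the volume of material above a stress threshold. All numbers
# here are invented for illustration.

volume = {101: 0.02, 102: 0.03, 103: 0.01, 104: 0.05}      # element -> in^3
stress = {101: 41000.0, 102: 28000.0, 103: 55000.0,
          104: 12000.0}                                     # element -> psi

threshold = 40000.0   # psi, the stress level specified by the customer
vol_above = sum(v for eid, v in volume.items() if stress[eid] > threshold)
print(vol_above)      # total volume stressed above the threshold
```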
Long story short: all you MAPDL users who have been complaining about not being able to access element tables should take a look at User Defined Results.
· ANSYS Release 13.0 must already be installed on your machine
· Time needed to complete update: 15-30 minutes
Begin ANSYS 13.0 SP2 update procedure
1. Login and download the appropriate Linux version update from the ANSYS Customer Portal.
2. Save ANSYS130SP2_LINX64.tar to your Desktop
3. Extract the files to the Desktop
4. Open a Terminal window session and change directory to the extracted folder on the Desktop. In my example the folder is ANSYS130SP2_LINX64.tar_FILES
5. After you have verified that you are in this folder, type ./INSTALL
7. Review the ANSYS Software License agreement and click I AGREE to continue.
8. Verify your installation directory and click Next.
You must install ANSYS RELEASE 13.0 SP2 into the same location as your original ANSYS RELEASE 13.0 installation.
9. Select the components you would like to install. The installation GUI will, by default, have the currently installed products selected.
You can choose to update fewer products, but you cannot add new products with the update package. To add products you would need to install from the main ANSYS RELEASE 13.0 media and then install SP2 on top of that installation.
Please note the amount of disk space required for the update: approximately 7.2 GB. Make sure that your Linux machine has at least 7.3 GB free and available.
10. Verification screens – Dates
11. Installation screens – Various screens will scroll through as the installation manager extracts the package files. Screen shot of Extraction: Package 4 of 16
12. Installation screens – Various screens will scroll through as the installation manager extracts the package files. Screen shot of Extraction: Package 5 of 16
13. Installation screens – Various screens will scroll through as the installation manager extracts the package files. Screen shot of Extraction: Package 14 of 16
14. Completed update – Click Next to begin the ANSYS 13.0 Licensing Client Installation update
15. ANSYS 13.0 Licensing Client Installation – Begin verification
16. ANSYS 13.0 Licensing Client Installation – Configuration log and successful completion of ANSYS 13.0 Licensing Client update.
17. Click Finish to end the update installation script.
As indicated in the IMPORTANT note below, please run the ANSYS, Inc. License Manager SP2 update after completing this procedure. This download is also available through the ANSYS Customer Portal.
Recently, PADT has conducted some parallel benchmarks with our Linux cluster. The model used for the tests is an ANSYS FLUENT *.cas file with 26.5 million cells and 6.5 million nodes. The physics in this model are fairly simple; it is modeling external, steady airflow over an object. Simulations of 10 iterations were conducted using as many as 144 processors over three 48-core systems (the machine names are “cs0”, “cs1”, and “cube48”, and are summarized in Table 1). The first two machines in Table 1 (cs0 and cs1) are connected together via Infiniband, so they effectively form a 96-core machine. Furthermore, the cs0 and cs1 machines are connected to cube48 via GigE network ports. Each machine has two GigE network ports which connect to a Gigabit switch to allow for trunking.
Table 1 – Benchmark machine specifications
cs0: 2.3 GHz AMD Opteron 6176SE
cs1: 2.3 GHz AMD Opteron 6176SE
cube48: 2.2 GHz AMD Opteron 6176SE
Two versions of FLUENT were tested; the original version 13 (R13), and the new Service Pack 2 for version 13 (R13 SP2). The results were scaled according to the one-processor solve time to compute the “speedup” (defined as the solve time on 1 processor divided by the solve time on “N” processors), and are presented in Figure 1 with a comparison to “Linear” or ideal speedup (ideal speedup implies that the speedup value is equal to the number of processors for a given processor count).
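The speedup definition above is easy to express directly; the small sketch below also adds parallel efficiency (speedup divided by processor count), which is a common companion metric. The timings used are illustrative placeholders, not the benchmark’s actual data.

```python
# Speedup and parallel efficiency as defined above: t1 / tN, and
# speedup / N. Efficiency = 1.0 is "linear"; above 1.0 is super-linear.

def speedup(t1, tn):
    """Solve time on 1 processor divided by solve time on N processors."""
    return t1 / tn

def efficiency(t1, tn, n):
    """Fraction of ideal (linear) speedup achieved on n processors."""
    return speedup(t1, tn) / n

t1, t48 = 4800.0, 96.0        # hypothetical solve times in seconds
print(speedup(t1, t48))       # 50x on 48 cores would be "super-linear"
print(efficiency(t1, t48, 48))
```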
Figure 1 – Speedup vs. Processors with 6.5 million node ANSYS FLUENT model
The first result in Figure 1 (blue curve) was obtained using as many as 48 processors on the cs0 machine before installing Service Pack 2 for FLUENT version 13. As illustrated in Figure 1, the speedup values of the blue result decrease as the number of processors increases. The next result (green curve with triangles) was calculated using Service Pack 2 for ANSYS FLUENT version 13 on the cube48 machine using as many as 48 processors. This result on cube48 shows nearly perfect correspondence with linear speedup, and even displays some values which are “super-linear”. This behavior led us to suspect that the improvements in Service Pack 2 for ANSYS FLUENT version 13 were the primary cause of the increase in performance demonstrated by the cube48 speedup result.
However, further testing of the cs0 and cs1 machines with Service Pack 2 for ANSYS FLUENT seems to suggest otherwise. These data are represented by the red squares in Figure 1. Runs were conducted on as many as 144 processors, which involved a distributed run using 48 processors on cs0, 48 processors on cs1, and 48 processors on cube48. The general trend of this result is the same as that recorded on cs0 before installing Service Pack 2 for ANSYS FLUENT version 13, suggesting that some other effect (likely machine-related) is present. Our current hypothesis is that the socket 2 processor (which handles the Infiniband UIO card hardware) is causing the slowdown, primarily because the Infiniband switch is present only on the cs0 + cs1 machine, and not on the cube48 machine. This was suggested by the manufacturer of the Infiniband switch (SuperMicro). Testing to assess this problem is ongoing.
We just finished up a tech support call for a customer that wanted a way to get accurate reaction loads from a PSD run in ANSYS Mechanical. Alex Grishin took the call and provided a nice example to the customer, so we thought we would leverage that and share it with all of you. Even if you are not in need of this particular function, it is a great example of using snippets. If you are not familiar with this, check out our recent webinar on the subject.
The reason you have to do this is that an accurate PSD force calculation is not a simple thing. The math is a bit complicated, because PSD responses are probabilities of results that lose sign, and right now the calculation is only available in Mechanical APDL (MAPDL). This is not a problem, because we can use an APDL command object to get the results from MAPDL and bring them back into ANSYS Mechanical.
Three Simple Steps
There are three very simple steps needed to get this done:
1. Identify the geometry you want the reaction loads calculated on. Do this by selecting a face, edge, or vertex and creating a named selection. You will use that named selection to grab the nodes that sit on the piece of geometry and do an FSUM in MAPDL. In our example, we call the named selection react_area1.
2. Tell the solver to store the required modal information. Since ANSYS Mechanical doesn’t do reaction force calculations, it saves disk space by not storing the information needed for them, but we need it. So add a command object in your modal analysis environment that says save all my results (outres) and expand all my modes (mxpand):
outres,all,all
mxpand,,,,yes,,yes
3. Calculate the reaction force. Now we simply need to add a command object to the post-processing branch that:
gets the PSD deflection results (set,3)
selects the named selection (cmsel), which is a nodal component in MAPDL,
selects the elements attached to those nodes (esln)
calculates the reaction load (fsum)
stores the results in parameters that are returned to ANSYS Mechanical (*get,my_…). Remember that any MAPDL parameter created in the post processor whose name starts with my_ gets returned to ANSYS Mechanical (well, that is the default; you can change the prefix).
select everything so that MAPDL can keep post processing like normal
For our example, it looks like this:
/post1
set,3,1
cmsel,s,react_area1
esln
fsum
*get,my_fsumx,fsum,0,item,fx
*get,my_fsumy,fsum,0,item,fy
*get,my_fsumz,fsum,0,item,fz
allsel
The following figure shows the model tree for our example, and the returned parameters:
Nothing fancy, simple in fact: Make a component, store the required info in the result files, do an FSUM and bring back the results.
Over the years I have learned to do more with less. When it comes to the information systems world, you all know the equation is often much more with much less. One of my to-do’s that continued to get bumped down the priorities list is the juggling act of making sure that the data center has enough cooling vs. power vs. (yes, again, in AZ) cooling. If you are an IT professional or even an engineer, you really don’t have time to try to convince someone, anyone, that we need to spend more money, even if you use the effective philosophy of Time, Money, and Quality. After dealing with “wishware” software vendors this past year, I added a fourth dimension to that philosophy: Functionality. Here is what Merriam-Webster has to say about functionality: “The quality or state of being functional; especially: the set of functions or capabilities associated with computer software or hardware or an electronic device.” http://www.merriam-webster.com/dictionary/functionality?show=0&t=1307146105
Time – Try searching the internet for terms such as data center cooling calculator, data center cooling costs, or how can I save money with our data center cooling? You will suddenly have millions of search results at your beck and call, from video blogs on data center cooling to white papers on optimizing IT strategy for data center cooling. It is endless: 3 million plus hits on just one search term, data center cooling calculator. Wow. Start researching, my young padawan learner; fill out those lead-generating white paper forms. Keep it simple…
Money – I would prefer that we use our dollars on buying a server with a couple of Intel Xeon E7-8870 processors. Or how about a QUAD based AMD FX-8130P processor server! I do not have any budget for an (I am sure fabulous) data center cost-benefit crisis analysis.
Quality – Can I even understand what the end-result white paper will say? Will I look like even more of an idiot? This needs to be accurate information. I will have to do it myself or use a third-party data center analysis.
Functionality – Will our current air conditioners hold up this year? What about when we hit 120 degrees? Oh my, do I need to add more cooling power?
Wikipedia on British Thermal Unit (BTU) – http://en.wikipedia.org/wiki/British_thermal_unit
So why do I care about a BTU? Because approximately one "ton of cooling", which is the typical way people talk about cooling capacity in the USA, is 12,000 BTU/h. This is the amount of power needed to melt one short ton of ice in 24 hours. Locked away in a climate-controlled vault is one of my data centers. “Said vault may or may not contain the following items on any given day.” After all, this is a mobile compute server world these days.
174 Cores (Mix of Windows/Linux servers)
2 – Network Routers
3 – Network Switches
Voice Mail System
1 LCD 20” monitor/KVM
Go Green in the Data Center! First, let’s get a “Cool Grip” on your data center…16,484.058 BTU/h
A couple of years ago, one of our ANSYS Mechanical simulation engineers, Jason Krantz, told me about a handy little watt meter monitoring device designed by P3 International, the KILL A WATT™. Over the years that little watt meter has become one of my closest friends and allies in IT. Today, I was able to quickly assess (realistically, about four hours of time) just how many Watts of power each one of our servers, network devices, etc. used. I tried to be as accurate as I could without having to take out a second mortgage, so I made sure and verified that one of our PhD FE analysts or CFD analysts had our servers at or near 100% CPU use.
YOUTUBE VIDEO :: Check out this real-world example of an AMD Opteron 6174 in a 287-hour electrical cost usage test. The data shown in this video is from a server that has four AMD Opteron 6174 processors installed.
AMD Opteron 6174 Electrical Cost Usage
So, what is your magical number? Ours is 4,831 Watts
Do you know how many watts of power your server room is using? Could you even logically guess what that number is? Our magical number for server room #1 turned out to be 4,831 Watts. I do need to state that I was unable to take some of the devices offline; when that was the case, I used data pulled from the technical documents on the device manufacturer's website.
So what is your BTU/h number? Ours is 16,484.058 BTU/h. Oh, and I don’t even like math. I know, I know, math was solved and perfected centuries ago. But how do I convert Watts into BTU/h?
I used a 99 cent app that I bought off of the iTunes App store called “Convert Units”.
Step 1 – Convert Watts into BTU/min.
Step 2 – Then multiply by 60 to get that value into BTU/h.
Step 3 – Speak to your Operations Department, send an email, shout from the rooftops!! We need at least a two ton Air Conditioning unit for Server Room #1
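The steps above boil down to one standard conversion factor, 1 W ≈ 3.412 BTU/h. Here is a small Python sketch of the math, using the wattage measured above (the function names are mine, not from any tool mentioned in the article):

```python
# A sketch of the Watts -> BTU/h -> tons-of-cooling math from the steps
# above. 1 W = 3.412142 BTU/h is the standard conversion factor; the
# wattage is the number measured with the watt meter.

BTU_H_PER_WATT = 3.412142   # 1 Watt of electrical load = ~3.412 BTU/h of heat
BTU_H_PER_TON = 12_000      # one "ton of cooling" = 12,000 BTU/h

def watts_to_btu_per_hour(watts: float) -> float:
    """Convert an electrical load in Watts to a heat load in BTU/h."""
    return watts * BTU_H_PER_WATT

def tons_of_cooling(btu_per_hour: float) -> float:
    """Convert a heat load in BTU/h to tons of cooling capacity."""
    return btu_per_hour / BTU_H_PER_TON

heat_load = watts_to_btu_per_hour(4831)   # the server room's 4,831 W
print(f"{heat_load:,.1f} BTU/h")          # ~16,484.1 BTU/h
print(f"{tons_of_cooling(heat_load):.2f} tons of cooling")  # ~1.37 tons
```

Which is where the "at least a two ton unit" recommendation comes from: about 1.37 tons of measured load, plus headroom for those 120-degree Arizona summer days.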
Now, with the precious BTU/h value in hand, I was able to speak the same language as our Director of Operations & Facilities Manager.
I wish you all could have been there when I walked up to Scott and told him the news. The dialogue went something like this:
“Scott, I wanted to talk to you about server room #1’s cooling situation…” (pause for dramatic expression). Almost immediately you could see Scott’s blood pressure rising, his brain quickly churning through mountains of air conditioning cooling information and data. I quickly calmed his anxiety and said these exact words: “Server Room #1’s heat load is approximately 16,484.058 BTU/h.” It took Scott just a moment for this bit of information to register. I do believe that I actually heard the hallelujah chorus in the heavens, and I could see the peace that passes all understanding come across Scott’s face. It was as if I could read his mind, and he was thinking: how is this non-operations/facilities-type humanoid speaking my language? For Scott knew immediately that he had enough cooling power to cool that data center down all summer long.
DATA CENTER #1 – 274.7343 BTU/min × 60 = 16,484.058 BTU/h
How the heck are you making money today? Step out of the Box, Step Into a Cube Computers for CFD and FEA Simulation. http://www.cube-hvpc.com/
It is a fact: Microsoft Excel is the most used engineering tool in the world. If you are like me, you do everything you can’t do in ANSYS in Excel, and a lot of the time you wish you could talk directly from Excel to ANSYS. Back in the day, many of us wrote kludgey VB macros that would write APDL scripts and run ANSYS MAPDL in the background. In the past couple of releases, our friendly neighborhood ANSYS developers have added a lot of different ways to work with Excel: saving tables to a file, Python scripting to talk to Excel, and an Excel system. I remember reading about these things as they came out, and even wrote about how cool they were, but I never had the opportunity to use them.
Then, last week, I noticed an Excel icon just sitting there in the toolbox, mocking me, taunting me to use it.
So I dragged it out onto my project page and tried to use it… and got nowhere. My assumptions were not valid; it didn’t work the way I expected because, it turned out, I was not thinking about how it fit into the project correctly. So I backed up, actually read the help (gasp!), and after some experimenting got it to work, or so I thought. Then I talked to some folks at ANSYS, Inc. and they reminded me of what this is: an Excel Solver System. It is not a tool that lets you “drive” your project from Excel. It lets you use Excel to calculate values.
This posting is a summary of what I learned. And as I was working through it I thought it would be good to also cover the python interface to Excel and how to save tabular information to Excel. These will be covered in future articles (hence this being Part 1).
What You Need to Know
As I said above, my problem was that I was thinking about how Excel fit into my project all wrong. The first thing you should do is read the help on the Excel system. The best way to find it is to type excel into search. The item with the most hits will take you to the article for component systems; then click on Microsoft Office Excel. (I wish I could just put a link in… grumble… grumble…)
To use the Excel system you do the following:
Add the System to your project
Make a spreadsheet and use range names to identify parameters
Attach an Excel spreadsheet
Edit the system and tell the program which parameters are input and which are output
Go into the parameter manager and hook up any derived parameters you want to pass to Excel and use any of the Excel parameters with other parameters as needed
Tell ANSYS to run a VB macro (if you want)
Update your project or Design Points
We will go through the process in detail but first, a few things you should know:
The system kind of looks and feels like the parameter manager in Workbench, but it is not. You have to think of Excel as a “solver” that takes parameters from the parameter manager and feeds values back to it.
I struggled with this because I thought of output parameters as values calculated by Workbench and input parameters as ones that come from Excel, but the opposite is true.
Excel Input Parameter: A value calculated in Workbench parameter manager
Excel Output Parameter: A value calculated in Excel
You need to get your head around this or you will get stuck like I did. The example should help.
Parameters that come from DesignModeler are dimensionless in the parameter manager.
This one really held me up for a while. If you assign a parameter from Excel that has a unit to drive your geometry in DesignModeler, you get an error.
The solution is to make sure that you DO NOT use units on Excel parameters that you get from or pass to DesignModeler.
When you attach the Excel file to your Excel system on the project page Workbench copies the Excel file to your project and buries it in dpall\XLS.
You will get burned by this if you go to your original Excel file, edit it, and then try to update your project. Your changes will not show up, because the project is not linked to the original file; it is linked to the copy stored in that XLS directory.
Once you have linked a file you should exit Excel then open the file by RMB on the Excel system and choose “Open File in Excel” (see below for more on this whole process)
I recommend that you start by making your Excel file, save it with the name you want in the C:\Temp directory, attach that file, close Excel, then open from Workbench.
Now you have a file to add your stuff to and you don’t have to worry about having an earlier version lurking around.
An important side effect of this is if you delete your system, it deletes your Excel file! So make sure you make a copy or do a save as before you remove the Excel system
To get changes in Excel to show up in your project, you need to save the file AND refresh/reload.
Making a change to the Excel file will put the system out of date. A refresh on the project page or a reload on the “Edit Configuration” page will update things.
The parameter names in Excel are case sensitive. So whatever your prefix is in the system properties (WB_ by default) you need to have the same in your Excel spreadsheet for range names.
To get a full update, including running any macros and doing any calculation, you have to update the system. This is kind of obvious, but I kept forgetting to do it.
Your Excel file will not update if you use RSM. Make sure your default for updating your project is to run local and, that if you are using design points, you set that update to run in the foreground.
The easiest way to check and change this is to click on the parameter bar and view its properties. Under Design Point Update Process set Update Option to Run in Foreground.
If you want to have your Excel file define both input and output parameters for the same ANSYS simulation, workbench sees that as a “cyclic dependency” and will not let you do it.
Although annoying at first glance, it kind of makes sense. If you feed a value to Excel and then Excel calculates a new value that affects your ANSYS model, you need to update the ANSYS model, which will change the value that gets passed into Excel, which will change the value that gets passed out, which changes your ANSYS model, which… and so it goes in a loop. This is considered a bad thing.
This goes back to the fact that Excel should be used as a solver, not as a “driver” of your simulation.
If you do want to drive your analysis from Excel, you’ll need to do some scripting. We’ll cover that in a future article.
I started this article with a really cool valve demo model, then found that it was just too slow and a pain to work with for showing how the Excel system works. So I went back to my second favorite type of model, a simple “tower of test” (my favorite is a FPWHII – flat plate with a hole in it). You can download the project here.
Add the System to Your Project
Like every other system in Workbench, you simply drag from the toolbox to the Project Schematic. Notice how the green “drop zones” are all empty spaces. You can’t drop it on an existing cell in a system because there is no dependency between other systems and an Excel system. The Excel system is connected through parameters, which we will see in a bit.
Once you have dropped it onto the schematic, click on the Top cell (C1 in this case) and check out the properties (RMB Properties if the window is not already open). From the properties you can see the system ID (XLS) and you can specify an Analysis Type. You can leave it blank or type in something like “Home Grown Optimization.”
Then click on the Analysis cell (C2 in this case) and look at the properties. They are shown here:
One key thing to note is that the directory where the Excel file will be copied is shown. I did this once already on this project so it made a XLS-1 directory. If I did it again, I’d see XLS-2, etc… In fact, by the time I got done with this article and trying all sorts of things, it ended up in XLS-8.
The most important option under Setup is the “Parameter Key.” Any Excel named range that begins with this string will get read into the parameter manager. If you make it blank, all the named ranges will come in.
Make a Spreadsheet and Use Range Names to Identify Parameters
Now you need to create your spreadsheet. You need to plan ahead here a bit. Figure out what parameters you need Excel to get from your models and what parameters you want to send back. Come up with good names because that is what gets passed to Workbench.
What happens when you attach a file is that Workbench goes to the Excel sheet and steps through all the named ranges in the file. If it finds one with a name that starts with the filter value, it grabs the first value in the range as the parameter value and then grabs the second as the units. If your range is bigger, it just ignores the rest.
So this tells us that we need to create a range that has at least one cell, or two if units are important. For our simple example, we will be calculating cost and outputting it using the input Volume, Length, and Width. There is a formula in the cost cell that multiplies those values by preset costs per unit volume, length, and width, and sums them up to get a cost.
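As a sketch, the cost cell's formula amounts to something like the following. The per-unit costs below are invented placeholders, not values from the actual spreadsheet:

```python
# A plain-Python mirror of the spreadsheet's cost cell: a weighted sum of
# the Volume, Length, and Width inputs. The cost coefficients here are
# made-up placeholders; the real values live in the Excel sheet.

COST_PER_VOLUME = 0.05   # $/unit volume (hypothetical)
COST_PER_LENGTH = 1.25   # $/unit length (hypothetical)
COST_PER_WIDTH  = 0.75   # $/unit width  (hypothetical)

def part_cost(volume: float, length: float, width: float) -> float:
    """What the cost cell computes: each input times its unit cost, summed."""
    return (volume * COST_PER_VOLUME
            + length * COST_PER_LENGTH
            + width * COST_PER_WIDTH)

print(part_cost(volume=1000.0, length=40.0, width=20.0))
```

In the Workbench system, Volume, Length, and Width would be Excel input parameters (calculated by Workbench) and the cost would be an Excel output parameter.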
So the laziest thing you can do is select a cell and name it.
But it will help you and others if you actually make a table that has a descriptive name, the parameter name (WB_ should be your default), the value for that parameter, and the units, if any. Note that for an input parameter you can just set the value to zero to get started. Here is what the tables look like for our example:
To create a range you select the value and units for a given parameter, hold down the Right-Mouse-Button (RMB), and Select “Define Range”
A cool thing that Excel does is use the value just to the left of the range as the default name of the range, so by creating the table you save yourself some typing. Or, if you don’t use a table, just type in the name you want.
Now just click OK and you have a named range. You can repeat this for each range, or you can get fancy and use the fact that your data is in a table, with the parameter name to the left, to quickly generate all the ranges at once.
To do this, select the WB Param, value, and unit columns. Then go to Formulas –> Defined Names –> Create from Selection. When the dialog box pops up, make sure only “Left column” is checked. Click OK.
In one fell swoop you created all your named ranges. To see, edit, and delete ranges, regardless of how they were created, go to Formulas –> Defined Names –> Name Manager.
Take some time to look at this and understand it. When you are debugging and fixing stuff, you will use this window.
Now you have an Excel file that Workbench will like! Time to attach it. Save it (I recommend saving to a temp directory so you don’t get it confused with the copy that Workbench will make).
Attach an Excel Spreadsheet
This is the easiest step. Simply RMB on the Analysis cell in the system and browse for the file.
Now your Analysis cell has a lightning bolt; update it to have it read the file and find parameters. If you have your parameters set up wrong, such that you don’t have any named ranges with the specified prefix, it will generate an error but will still attach the file.
NOTE: If you get some weird errors “Unexpected error…” and “Exception from HRESULT:…” when updating your Excel system, check your Excel file. Odds are you have an open dialog box or the file is somehow locked. The error generates because Workbench can’t get Excel to talk to it.
Edit the System and Tell the Program Which Parameters are Input and Which are Output
Although you have a green check mark, you will notice that your system is still not connected to your parameters, and therefore it is not connected to the rest of your model. The way to fix this is RMB -> Edit Configuration. Double-clicking on Analysis also does the same thing.
This puts you in Outline Mode. You should be familiar with this mode from the Parameter Manager or Engineering Data.
Take some time to explore this outline. Notice the Setup cell, where you have access to the system properties. Then its child, the Excel file; click on it to see the properties for the file connection. Under that is the important stuff: the parameters.
If you did everything correctly, you will see all of your parameters in alphabetical order. If you click on one, you will see the properties. Here they are for the cost value:
It shows the range, the value and units (C column), and the Quantity name. Workbench guesses the quantity type from the units, so psi comes in as pressure by default. If it is a stress, you need to change it here.
But your main task right now is to tell Workbench which of these parameters you want passed to the parameter manager, and what type of parameter, input or output, they are. Here is where I got screwed up, because an input parameter in the parameter manager is an output parameter here. Remember, the Excel system is a solver that takes in parameters from the parameter manager and sends back values to drive your models. So in our example, all the dimensions and the volume are passed from the parameter manager TO Excel, so they are input. The cost is passed from Excel to the parameter manager, so it is output.
Now you have hooked up your Excel system. Click on the “Return to Project” at the top of the window and you will go back to the project schematic and see that a Parameters cell has been added to the system and it has been attached to the parameter bar.
Go Into the Parameter Manager and Hook Things Up
Although the Geometry and Mechanical systems are connected through the parameters to the Excel system, no relationships exist. We need to assign some values to our Excel parameters.
This is what our test model looks like before we do this:
Our goal is to have the parameters in the first column below to drive those in the second:
P8: Solid Volume
I tried to just click on the value in the value column (C) and change it from the number it is to the parameter name it should be, but that does not work, because the parameter is set as a constant. So you need to click anywhere on the row for the parameter you want to set, then go down to the Properties window and change the Expression to the parameter ID you want. This changes the Expression to be an equation and the Expression Type to be Derived:
That is it. You now have Excel in your project as a solver. Update your project and the cost will be calculated and presented as a parameter for optimization, DOE studies or whatever you want.
Tell ANSYS to run a VB macro (if you want)
One really cool feature is that you can tell the program to run a VB macro on an update. What you do is go to your system, click on Analysis, then RMB -> Edit Configuration. Then click on the file cell (A3). The property area now shows info on your file and has a Use Macro row at the bottom. Click on the checkbox and a Macro Name row will pop up. Enter the name of a macro in your spreadsheet and you are off.
Here is a silly example where I use a macro to calculate a value. For the example I put in the well known equation for deriving the Kafizle of a system:
Create a new row in my table for the Kafizle value to go in
Create a name WB_KF for the value
Write my macro (don’t laugh):
Sub CalcKafizle()
    Range("E7") = Rnd(1) + Cos(Range("E5").Value)
End Sub
Save my sheet and KABOOM. I now need to save it as an xlsm, not xlsx! I didn’t think about that!
This means my Excel connection won’t work, so you have to delete your system and start again with your macro file. So plan ahead! I’m glad I did this silly example rather than running into it on a real problem.
Once everything is right again, go into the outline for the excel system and make that new parameter (WB_KF) an output parameter.
Then click on the File (A3) and go to the properties window and click on the Use Macro checkbox
Put the macro name into the Macro Name field
Now you can run your project, and every time you do, the program will calculate a new cost and Kafizle value. This of course begs the question: what are the proper units of Kafizle? Here is the Design Point table:
Thoughts and Conclusions
I started this effort thinking I would drive my model from Excel, basically replacing the Parameter Manager with Excel. But that does not work because Excel doesn’t know enough about your project to handle the dependencies that can really cause problems if you don’t solve in the correct order. So once I figured that out I found some pretty good uses. Here are some other ideas for how to use the Excel System:
Do additional post processing on result values
Use formulas or lookup tables to calculate loads.
Just make sure that the values you take from your ANSYS model into Excel (inputs) are also input parameters in the parameter manager.
Use tables and lookups to calculate input values for an analysis
A good example would be a “family of parts” application where you put in a part number and Excel does a vlookup() on a table that has all the input parameters listed by part number.
To include results from an ANSYS analysis in a system model you have in Excel.
You still have to force the update on the ANSYS side, which is not the ideal way to run a system model, but it may be easier than writing scripts and hooking it up that way.
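The family-of-parts idea above can be sketched as a plain lookup. Here is a hypothetical Python stand-in for that vlookup() table; the part numbers and dimensions are invented for illustration:

```python
# A stand-in for the Excel vlookup() table in the family-of-parts idea:
# one row per part number, with the analysis input parameters as columns.
# All part numbers and dimensions below are invented.

PART_TABLE = {
    # part_number: (length_mm, width_mm, thickness_mm)
    "PN-1001": (100.0, 25.0, 3.0),
    "PN-1002": (150.0, 30.0, 4.5),
    "PN-1003": (200.0, 40.0, 6.0),
}

def lookup_inputs(part_number: str):
    """Return the analysis input parameters for a part number,
    like vlookup() on the family-of-parts table."""
    return PART_TABLE[part_number]

length, width, thickness = lookup_inputs("PN-1002")
print(length, width, thickness)  # 150.0 30.0 4.5
```

In the Workbench version, the part number would be the single value you type into the sheet, and the looked-up dimensions would be Excel output parameters driving the geometry.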
This is a new feature at R13 and it can be a bit “touchy,” especially if you are rooting around in it like a javelina rooting around in your flower bed (Arizona reference). If you do something really crazy, it can lose its way and start generating errors. I found the best solution at that point was to save a copy of my Excel file, delete the system, and start over.
This took a lot longer than I thought to write, but the Excel System does a lot more than I thought. I think as we all start thinking about how to use this tool, people will come up with some pretty cool applications.
Getting ANSYS Workbench up and running on Linux at R13 is pretty simple. You just have to make sure that a few things are in place and some packages are loaded. Then it works great. Here is a quick HOW-TO on getting things going:
Install CentOS 5.3 or greater or RHEL5
Download and install the latest graphics card drivers for your video card. Restart
Next, the GNOME Desktop Environment is required for optimum use.
Next, using the Linux package manager, select the Development main group and then select the additional libraries needed (see images below).
Select Optional packages and then select the additional MESA libraries (see below).
Next, select the Base System main group, then X Window System and Legacy Software Support. With Legacy Software Support still selected, click Optional Packages, select the additional package openmotif22, and click Close.
Restart the system
Post ANSYS Install Setup Tasks
Within your terminal session, type ProductConfig.sh
Click Configure Products, then select the products to configure or reconfigure
Pro/E Configuration GUI
Unigraphics NX install Configuration GUI
Click Continue and the product configuration script will run.
How to launch ANSYS Release 13.0 Workbench
Open a Linux terminal session:
Change your path to include /ansys_inc/v130/Framework/bin/Linux64
Next, launch the program by typing ./runwb2 and pressing Enter
Basic opening up of a Design Modeler project
Here it is: ANSYS 13 Workbench on CentOS 5.5 64-bit Linux
I’m sure the question comes up for a lot of us from time to time, whether from one of our own offspring, another relative, or an acquaintance. “Just what is it that you do, anyway?” Typical answers might be something short and sweet, such as, “I’m an engineer.” A more detailed response might be, “I use a technique called finite element simulation which is a computer tool we use to simulate the behavior of parts or systems in their real world environment.”
You’ll probably find that people’s eyes glaze over and they start looking for someone else to talk to by the time you get to the end of that second quote above. In fact, I find that my extended family is much more interested in my brother-in-law’s surgery stories from the operating room than they are in my own triumphs and challenges in the engineering simulation world. Maybe you’ve had that same sort of reaction. You have probably noticed that there are a whole lot more medical dramas on TV at any one time than there are engineering dramas. They’ve got many characters from Marcus Welby on up to Dr. Ross on ER, Jack on Lost, to Dr. Grey on Grey’s Anatomy, with more than I can count in between.
We’ve got, well, Scotty. And even then I think Dr. McCoy got more air time.
So when my kids ask me what I do at work, I recall a scene from that late 1980’s to early 1990’s TV show The Wonder Years. In the episode “My Father’s Office,” Kevin asks his dad what he does for a living. His father responds in an angry tone, “You know what I do! I work at NORCOM.” As if that were a sufficient explanation. I suppose it was his way of saying, “It’s complicated. It can be high pressure. You might find it boring. It puts food on the table and a roof over our heads, though.”
Rather than reply that way, I’ve tried to come up with what is hopefully a better response. In fact, this concept constitutes the first portion of our Engineering with FEA training class, written by Keith DiRienz of FEA Technologies with contributions by yours truly.
I can’t guarantee that your audience’s eyes won’t glaze over by the end, nor that you’ll become the hit of the party, but this is free and you get what you pay for. This explanation can obviously be adjusted based on the audience, but it goes something like this:
–We have equations to solve for stresses and deflections in simply-shaped parts such as cantilevered beams.
–No such equations exist for complex shaped objects subject to arbitrary loads.
–So, using finite elements, we break up a complex part into solvable chunks, leading to a finite set of equations with finite unknowns.
–We solve the equations for the chunks, and that ends up giving us the results for the whole part.
If we want more details, we can use this: As an example, here is a simple beam, fixed at one end with a tip load P at the other end. We have an equation to calculate the tip deflection u for simple cases:
In the above equation E is the Young’s Modulus, a property of the material being used and I is the moment of inertia, a property of the shape of the beam cross section.
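For reference, the equation being described is the standard cantilever tip-deflection formula, u = PL³/(3EI). Here is a quick numeric sketch; the beam numbers below are made up for illustration:

```python
# The standard cantilever-beam tip-deflection formula, u = P*L^3 / (3*E*I),
# for a beam fixed at one end with a point load P at the free end.
# The numbers below are made-up illustration values, not from the article.

def tip_deflection(P: float, L: float, E: float, I: float) -> float:
    """Tip deflection of a cantilever: load P, length L, Young's modulus E,
    area moment of inertia I."""
    return P * L**3 / (3 * E * I)

# A 1 m steel beam (E = 200 GPa) with I = 1e-6 m^4 and a 1 kN tip load:
u = tip_deflection(P=1000.0, L=1.0, E=200e9, I=1e-6)
print(f"{u * 1000:.3f} mm")  # 1.667 mm
```

Stiffer material (bigger E) or a beefier cross section (bigger I) means less deflection, which is exactly the spring-stiffness intuition used below.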
For more complex shapes and loading conditions, we don’t have simple equations like that, but we can use the concept by dividing up our complex shape into a bunch of simpler shapes. Those shapes are called elements.
A useful equation for us is the linear spring equation, F=Kx, where F is the force exerted on the spring, K is the stiffness of the spring, and x is the deflection of one end of the spring relative to the other. If we extend that concept into 3D, we can have a spring representation in 3D space, meaning the X, Y, and Z directions. In fact, the tip deflection equation shown above for the beam fixed at one end can be considered a special case of our linear spring equation, solved for deflection with a known applied force.
By assembling our complex structure out of these 3D springs, or elements, we can model the full set of geometry for complex shapes. The process of making the elements is called meshing, because a picture or plot of the elements looks like a mesh.
Using linear algebra and some calculus (stay in school, kids!) we can set up a big series of equations that takes into account all the little springs in the structure, as well as any fixed (unable to move) locations and any loads on the structure. The equations are too big for normal people to solve by hand, so computers are used to do this.
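The "assemble the little springs and solve" idea can be shown in miniature: two 1D springs in series, one end fixed, a force on the free end. This is a toy sketch of the concept, not what ANSYS actually does internally, and the stiffnesses and load are made-up numbers:

```python
# A miniature finite-element solve: two 1D springs in series
# (node 0 -- spring k1 -- node 1 -- spring k2 -- node 2),
# node 0 fixed, force F pulling on node 2.

def solve_two_springs(k1: float, k2: float, F: float):
    """Assemble the stiffness equations for the free nodes and solve K u = f.
    After removing the fixed node 0, the reduced system is:
        [k1+k2  -k2] [u1]   [0]
        [ -k2    k2] [u2] = [F]
    """
    a, b = k1 + k2, -k2
    c, d = -k2, k2
    det = a * d - b * c          # determinant = k1*k2
    u1 = (d * 0.0 - b * F) / det  # Cramer's rule for the 2x2 system
    u2 = (a * F - c * 0.0) / det
    return u1, u2

u1, u2 = solve_two_springs(k1=1000.0, k2=1000.0, F=10.0)
print(u1, u2)  # 0.01 0.02 -- each spring stretches F/k, and the stretches add
```

A real FEA mesh does the same thing with thousands or millions of equations, which is why the computer gets the job.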
When the computer is done solving we end up with deflection results in each direction for the corner points (called nodes) in each element. Some elements have extra nodes too.
From those deflection results, the computer can calculate other quantities of interest, such as stresses and strains. Further, other types of analyses can be solved in similar fashion, such as temperature calculations and fluid flow.
Here is an example using a familiar object that practically everyone can relate to. This plot shows the mesh:
This is fixed in the blue region at the bottom and has an upward force on the left end. The idea here is that someone is holding it tightly on the blue surface and is pulling up on the red surface.
After solving the simulation, we get deflection results like this:
The picture above shows that the left end of the paper clip has deflected upward, which is what we would expect based on common experience with bending paper clips. Using our finite element method, we can predict the permanent deflection resulting from bending the paper clip beyond its ‘yield’ point, resulting in what we call plastic deformation.
Clearly there is a lot more to it than these few sentences describe, but hopefully this is enough to get the point across.
In sum, not as exciting as my brother in law’s medical stories involving nail guns or other gruesome injuries, but hopefully this makes the world of engineering simulation a little more accessible to our friends and family.
In the Wonder Years episode, Kevin ends up going to work with his father to see for himself what he does. I won’t spoil the episode, but hopefully you’ll get the chance to show your family and friends what it is that you do from time to time.
During this week's webinar on Using APDL Snippets in ANSYS Mechanical, a question came up about coordinate systems. I actually don't remember the original question, but in answering it another question came to mind: how do you get access to the ID of coordinate systems that you create in ANSYS Mechanical?
For a lot of items you can add to the ANSYS Mechanical model tree, you can attach a Command Object (snippet) and ANSYS Mechanical passes a parameter with the ID of the thing you want access to (material, contact pair, spring, joint, etc…). But there is no way to add a Command Object to a coordinate system.
So I dug into it and found something I didn’t know. The problem with discovering something like this and sharing it is that you either just uncovered something that can help a lot of users or you are going to embarrass yourself over something that everyone already knows. The idea of a blog is to be casual and informal, so let’s see which I did.
If you click on a user-created coordinate system in ANSYS Mechanical, the Details view lists two things in the first grouping, "Definition": Type and Coordinate System ID. The default for the ID is "Program Controlled," and I had never clicked on it to see what the other options are. It turns out you can change it to "Manual."
Once you do that, a second "Coordinate System ID" line appears, and you can put in whatever number you want there.
Problem solved. Just give your coordinate system whatever number you want and use that number in your macro. Couldn’t be easier.
Hopefully, this was helpful. If so, rate this posting at a 5.
If you already knew this little factoid, rate it as a 1.
As promised, this week's The Focus posting will be a follow-up on that webinar with an in-depth look at the scripts that were used in the example. But first, let us review what DA is for those that don't remember, fell asleep, or don't want to sit through 60+ minutes of me mumbling into the telephone.
Review of DA
Design Assessment is a tool in ANSYS Workbench that works with ANSYS Mechanical to take results from multiple time or load steps, perform post processing on those results, and bring the calculated values back into ANSYS Mechanical for viewing as contour plots. It was developed to let the ANSYS, Inc. developers add some special post processing tools needed in the offshore industry, but as they were working on it they saw the value of exposing the Application Programming Interface (API) to the user community so anyone can write their own post processing tools.
You use it by adding a Design Assessment system to your project. In its most basic form, the default configuration, it is set up to do load case combinations, which in itself is worth knowing how to use. But if you want to do more, you can point it to a custom set of post processing files and do your own calculations.
A custom DA is defined by two things. First is an XML file that tells ANSYS Mechanical what user input you want to capture, how you want to get results out of Mechanical, what you want to do with the results, and how you want them displayed in your model tree. Second is one or more Python scripts that actually do the work of capturing what the user entered, getting results, doing the calculation, and sticking the resulting values back into the model. Both are well documented and, once you get your head around the whole thing, pretty simple.
Right now DA works with Static and Transient Structural models. It also only allows access to element stress values. Lots of good enhancements are coming in R14, but R13 is mature enough to use now.
If that review went by too quickly, check out the recording or the PowerPoint presentation.
A Deep Dive Into an Example
For the webinar we had a pretty simple, and a bit silly, example: the custom post processing tool takes the results from a static stress model and truncates the stress values if they are above or below user-specified values. Not a lot of calculating, but a good example of how the tool works.
Note, this posting is going to be long because there is a lot of code pasted in. For each section of code I’ll also include a link to the original file so you can download that yourself to use.
I always have to go over XML files a few times to figure them out. There is a lot of information, but only a small amount that you need to pay attention to. After a while you figure out which is which.
Now on to the fun part, the Python scripts. The first one gets executed when the user chooses Solve.
da_trunc_solve.py is shown below and you can get the original here. The comments inside pretty much explain it all. It basically does two things: creates an ANSYS MAPDL macro that extracts all the element stresses and puts them in a text file, then it runs MAPDL with that macro.
Some key things you should note about this script:
You have to use MAPDL to get your stress values. Right now there is no method in the API to get the values directly.
You need a second MAPDL license in order to run this script. It does not share the license you are using for ANSYS Mechanical at R13. This should be addressed in R14.
One workaround right now is to use an APDL code snippet in the ANSYS Mechanical run that makes the text file when the original problem is solved. The solve script is then no longer needed and you can just have an evaluate script. Not a great solution, but it will work if you only have one seat of ANSYS available.
Note the directory changes and the retrieval of result file paths. This is important: Mechanical puts files all over the place, not in just one directory.
Make sure the MAPDL execution stuff is correct for your installation.
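As a hedged sketch of what the heart of such a solve script can look like (the MAPDL executable path, file names, and the exact ETABLE/PRETAB macro recipe below are my assumptions, not the original script's; check everything against your own installation), it really is just "write a macro, shell out to MAPDL":

```python
import subprocess

# Hypothetical install path; adjust for your machine and version.
MAPDL_EXE = "/ansys_inc/v130/ansys/bin/ansys130"

def build_stress_macro(out_name="darsts"):
    """Build a sketch of an MAPDL macro that tabulates element SX
    stresses and writes them to a text file. Real MAPDL commands,
    but an illustrative recipe, not the original script's macro."""
    return "\n".join([
        "/POST1",                      # enter the general postprocessor
        "SET,LAST",                    # read the last result set
        "ETABLE,sx_t,S,X",             # element table of SX stress
        "/OUTPUT," + out_name + ",txt",  # redirect printed output to a file
        "PRETAB,sx_t",                 # print the element table
        "/OUTPUT",                     # restore output to the terminal
        "FINISH",
    ]) + "\n"

def run_solve():
    """Write the macro and launch MAPDL in batch mode on it.
    The -b/-i/-o flags are typical MAPDL batch flags; verify them
    for your version before relying on this."""
    with open("getstress.mac", "w") as f:
        f.write(build_stress_macro())
    subprocess.call([MAPDL_EXE, "-b", "-i", "getstress.mac",
                     "-o", "getstress.out"])
```

The real da_trunc_solve.py also handles the directory hopping and result file paths discussed above, which this sketch leaves out.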
Once the solve is done and our text file, darsts.txt, is written, we can start truncating with the evaluate script (Original File). This script is a little more sophisticated. First it simply reads the darsts.txt file into a python array. It then has to go through a list of DA Results objects that the user added to their model and extract the stress value wanted as well as the floor and ceiling to truncate to. For each result object requested, it then loops through all the elements truncating as needed. Then it stores the truncated values.
# Get the element number and then grab the stress value that
# was read from file
Element = Mesh.Element(ElementIter).ID()

if strscmp == "SX":
    sss = sx[Element-1]
elif strscmp == "SY":
    sss = sy[Element-1]
elif strscmp == "SZ":
    sss = sz[Element-1]
elif strscmp == "SXY":
    sss = sxy[Element-1]
elif strscmp == "SYZ":
    sss = syz[Element-1]
elif strscmp == "SXZ":
    sss = sxz[Element-1]
elif strscmp == "S1":
    sss = s1[Element-1]
elif strscmp == "S2":
    sss = s2[Element-1]
elif strscmp == "S3":
    sss = s3[Element-1]
elif strscmp == "SEQV":
    sss = seqv[Element-1]

# Compare to Ceiling and Floor and truncate if needed
if sss > strCeil:
    sss = strCeil
if sss < strFloor:
    sss = strFloor

# Store the stress value

# Go back to the original directory
Some things to note about this script are:
The same directory issues hold here. Make sure you follow them.
Always loop on ResultGroups. You can assume the number of attributes is constant but you never know how many results the user has asked for.
In this example it is assumed that the stress label is the first attribute and that the floor and ceiling are the second and third. This is probably lazy on my part and it should be more general.
The way to make it more general is to loop on the attributes in a group and grab their label, then use the label to determine which value it represents.
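That generalization is easy to sketch in plain Python: instead of an if/elif chain keyed to a fixed attribute order, map each stress label to its array once and look the label up. The variable names mirror the script's convention (sx, sy, and so on), but the dictionary and the tiny dummy arrays here are my addition so the sketch is self-contained, not something from the original file.

```python
# Dummy per-element stress arrays standing in for the values
# read from darsts.txt (two elements each, made-up numbers).
sx, sy, sz = [1.0, 2.0], [3.0, 4.0], [5.0, 6.0]
sxy, syz, sxz = [0.1, 0.2], [0.3, 0.4], [0.5, 0.6]
s1, s2, s3, seqv = [7.0, 8.0], [9.0, 10.0], [11.0, 12.0], [13.0, 14.0]

# One lookup table replaces the whole if/elif chain.
stress_tables = {
    "SX": sx, "SY": sy, "SZ": sz,
    "SXY": sxy, "SYZ": syz, "SXZ": sxz,
    "S1": s1, "S2": s2, "S3": s3, "SEQV": seqv,
}

def truncated_stress(label, element, floor, ceil):
    # Grab the requested component for this element, then clamp
    # it between the user's floor and ceiling values.
    sss = stress_tables[label][element - 1]
    return max(floor, min(ceil, sss))
```

With the attribute labels driving the lookup, the script no longer cares what order the user defined the attributes in.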
Before you can store values, you have to create the result object and then each result value in that structure:
ResultStructure = ResultGroup.AddStepResult() for each result object the user adds to the tree
ResultValue = ResultStructure.AddElementResultValue() for each element
ResultValue.setValue(sss) to set the actual value
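Putting those three calls together, the storing loop nests like the sketch below. The real DA objects only exist inside ANSYS Mechanical, so tiny stand-in stub classes are used here purely to show the nesting; only the AddStepResult / AddElementResultValue / setValue call pattern comes from the actual API.

```python
# Stand-in stubs for the DA API objects. Assumption: the real
# objects match these only in the three calls being demonstrated.
class StubResultValue:
    def setValue(self, v):
        self.value = v

class StubResultStructure:
    def __init__(self):
        self.values = []
    def AddElementResultValue(self):
        rv = StubResultValue()
        self.values.append(rv)
        return rv

class StubResultGroup:
    def __init__(self):
        self.results = []
    def AddStepResult(self):
        rs = StubResultStructure()
        self.results.append(rs)
        return rs

# One result structure per result object the user added to the
# tree, one result value per element, then set each value.
truncated = [1.0, 2.0, 3.0]   # dummy truncated stresses, one per element
ResultGroup = StubResultGroup()
ResultStructure = ResultGroup.AddStepResult()
for sss in truncated:
    ResultValue = ResultStructure.AddElementResultValue()
    ResultValue.setValue(sss)
```

The important part is the order: create the structure first, then a value object for each element, then set the value on it.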
And that is our example. It should work with any model; just make sure you get the paths right for where the files are.
Be a Good Citizen and Share!
If you have the gumption to go and try this tool out, we do ask that you share what you come up with. A good place is www.ANSYS.NET. That is the repository for most things ANSYS. If you have questions or need help, try xansys.org or your ANSYS support provider.