News, Blogs and Updates

Upcoming PLM webinar for the simulation engineering community

This could be a good PLM webinar for the simulation engineering community. Our partners at Sherpa Design, who are PLM experts, brought it to our attention and recommended it. The webinar host, CIMdata, focuses on the PLM market and provides multiple viewpoints into the PLM industry at large.

Their upcoming webinar will be on "Simulation Governance: Managing Simulation as a Strategic Capability." This topic could resonate with larger companies that have both engineering and analysis groups under the same roof.

February 11, 2016
11:00 EST | 08:00 PST | 17:00 CST
Register

Discrete Element Method (DEM) article published in FEA Information Engineering Chinese Journal

FEA Information Engineering Chinese Journal (a Chinese-language journal from the LS-DYNA community) recently prepared a special issue focusing on the Discrete Element Method (DEM) and Smoothed Particle Hydrodynamics (SPH). Predictive Engineering’s Adrian Jensen and Kirk Fraser were honored to have their DEM paper translated and published in the journal! Thanks to Hailong Teng for translating the paper while maintaining the original authorship.

A copy of the issue can be downloaded here, and to subscribe to future editions, send an email with the subject line “Subscribe” to [email protected].

LS-DYNA: Observations on Explicit Meshing

This is the third in a series of informal articles about one engineer’s use of LS-DYNA to solve a variety of non-crash simulation problems. The first was LS-DYNA: Observations on Implicit Analysis, the second was LS-DYNA: Observations on Composite Modeling, and the fourth is LS-DYNA: Observations on Material Modeling.

I come from a background in implicit analysis, where element quality can often be swept under the rug by using dense meshes, and since the models run quickly no one really cares about model size (i.e., ten million DOF). What I enjoy about explicit analysis is that it demands the utmost in model preparation, from the choice of element types to the creation of perfect quad- and hex-dominant meshes having the absolute minimum number of DOF.
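The reason explicit work rewards that discipline is the stability limit on the time step: the smallest, worst-shaped element in the mesh sets the pace for the entire run. As a rough rule of thumb,

\[ \Delta t_{\text{stable}} \approx \frac{L_{\min}}{c}, \qquad c = \sqrt{E/\rho} \]

so a single sliver element can throttle millions of well-behaved ones, and every unnecessary DOF is paid for again on each of those tiny time steps.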

What I have noticed over the last couple of years in the explicit world is the creation of gigantic meshes justified by saying, “It runs just fine in four hours using 32 CPU-cores.” Although it runs, I wonder how much time was spent debugging this beast, and also whether the mesh density was justified by experience or by the economy of using an offshore meshing service.

What’s the Point?

Our Partners

Siemens PLM Software
Livermore Software Technology Corporation
Applied CAx
Sherpa Design
ENHU