<?xml version="1.0" encoding="utf-8"?>
  <rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom">
    <channel>
      <title>Design &amp; Simulation</title>
      <link>https://blog--3ds--com.apsulis.fr/topics/design-simulation/feed.xml</link>
      <description>Design &amp; Simulation</description>
      <lastBuildDate>Thu, 05 Mar 2026 16:10:06 GMT</lastBuildDate>
      <docs>https://validator.w3.org/feed/docs/rss2.html</docs>
      <generator>3DExperience Works</generator>
      <atom:link href="https://blog--3ds--com.apsulis.fr/topics/design-simulation/feed.xml" rel="self" type="application/rss+xml"/>

      <item>
      <title>
      <![CDATA[ Unlocking the Potential of CATIA in the Building Industry ]]>
      </title>
      <link>https://blog--3ds--com.apsulis.fr/brands/catia/unlocking-the-potential-of-catia-in-the-building-industry/</link>
      <guid>https://blog--3ds--com.apsulis.fr/guid/275081</guid>
      <pubDate>Thu, 12 Dec 2024 13:39:13 GMT</pubDate>
      <description>
<![CDATA[ Antoine Duphil, CAD/PLM Consultant at TECHSO, interviewed by Jonathan ASHER, CATIA Construction Sales Director, on how his company uses 3DEXPERIENCE CATIA
 ]]>
      </description>
      <content:encoded>
      <![CDATA[ 

Discover how tailored features and innovative solutions in CATIA are revolutionizing project management in the building industry.



Understanding Antoine’s Role in the Industry



The building industry, with its intricate processes and complex project management needs, greatly benefits from the expertise of professionals like Antoine. As a CAD/PLM Consultant at TECHSO Solutions in Montreal, Antoine focuses on optimizing the control of the life cycle of building components. His role involves deliberating with stakeholders to identify the most effective practices to enhance reliability across projects.



Antoine’s approach is multifaceted. He listens, understands, and analyzes the needs of various architecture and engineering firms. This understanding allows him to customize functionalities that align with users’ daily activities. The goal? To create seamless and effective workflows without imposing rigid constraints.



Harmonizing processes and unifying methodologies, while maintaining flexibility, is key.



Flexibility is crucial, as it enables Antoine and his team to adapt to diverse inputs from architects and engineers, ensuring that every project is uniquely catered to.



The Transformative Benefits of CATIA



CATIA is more than just a tool; it’s a transformative platform that significantly enhances project outcomes. Antoine utilizes applications such as CATIA Building 3D Design and Assemblies, alongside CATIA Visual Scripting, to customize automation and integrate proprietary knowledge into existing features.



Three primary benefits emerge from using these applications:




Intellectual Property Features: These features save time and allow users to explore various scenarios, aiding in both creation and modification processes.



Propagation of Impact: With an up-to-date mockup, the propagation of impact is streamlined, reducing surprises by identifying direct interfaces between products.



Material Efficiency: By linking final products to raw materials, CATIA helps transform waste into salvaged material, significantly reducing consumption on construction sites.





CATIA allows us to transform waste material into reusable resources, minimizing construction costs.



These benefits underscore CATIA’s role in delivering efficient, sustainable, and innovative solutions that align with industry needs.



Evolving Features for Tailored Solutions



CATIA’s evolution is a testament to its commitment to meeting user needs. Antoine appreciates how the platform’s features are increasingly tailored not only to functional requirements but also to the cultural and linguistic nuances of users. This adaptability ensures that CATIA remains relevant and effective in diverse contexts.



A standout feature is the ability to embody different “truths” for a single object. This means users can switch between detailed and basic views as needed, providing precise information when required.




“CATIA’s evolving features match both the needs and culture of its users, offering multiple truths for a single object.”



This flexibility is a game-changer, enabling users to access the right level of detail for any given task.



Conclusion



Antoine’s insights highlight CATIA’s transformative impact on the building industry. By tailoring functionalities to user needs and leveraging innovative solutions, CATIA is redefining project management. The platform’s evolving features, combined with Antoine’s expertise, ensure that building projects are more efficient, sustainable, and adaptable than ever before. As CATIA continues to evolve, it promises even greater alignment with the dynamic needs of the industry.




CATIA is not just evolving; it’s reshaping how we approach building projects, making them more efficient and sustainable.



Discover more on the free online CATIA Buildings and Infrastructure community: thousands of tutorials and CATIA experts!
 ]]>
      </content:encoded>
      </item>
<item>
      <title>
      <![CDATA[ Benefits and Challenges of Using Big Data in Resource Estimation ]]>
      </title>
      <link>https://blog--3ds--com.apsulis.fr/brands/geovia/benefits-and-challenges-of-using-big-data-in-resource-estimation/</link>
      <guid>https://blog--3ds--com.apsulis.fr/guid/275094</guid>
      <pubDate>Thu, 12 Dec 2024 13:23:32 GMT</pubDate>
      <description>
      <![CDATA[ The good news is that advances in data acquisition technologies mean that a whole new world of soft data – from downhole geophysics, multi- and hyper-spectral core scanning, and more routine collection of geometallurgical parameters – is now available to inform and enhance resource modelling and estimation.
 ]]>
      </description>
      <content:encoded>
      <![CDATA[ 
Authored by Michael Mattera, GEOVIA Industry Process Consultant Senior.



To estimate the properties of a mineral deposit as reliably as possible — especially as economic orebodies around the world become increasingly complex — a geologist must thoroughly understand the deposit as well as the method of emplacement/mineralisation.



And the only way geologists can do that is through using sound, dependable data.



Yet traditionally, mining companies have relied solely on two exploratory drilling methods to obtain the physical samples (typically the only working data) that resource geologists use to model and estimate mineralisation:




diamond drilling, which involves extracting small-diameter cores of rock for analysis



reverse-circulation drilling, which involves collecting crushed rock cuttings for analysis.




The result is that billion-dollar decisions are based on the physical analysis of a very small amount of material while the bulk of the material to be mined, both overburden/waste and the mineralised orebody itself, remains unexamined.



To ensure higher quality resource estimates, geologists need more data.



The good news



The base data for geological modelling and resource estimation can be classified as either hard data (data that is directly observed and measured), or soft data (which make up the bulk of what’s known as ‘big data’) from other sources.



The good news is that advances in data acquisition technologies mean that a whole new world of soft data – from downhole geophysics, multi- and hyper-spectral core scanning, and more routine collection of geometallurgical parameters – is now available to inform and enhance resource modelling and estimation.



For example, using soft data can help resource geologists detect correlations between variables that might not be immediately obvious from hard data alone, such as a subtle alteration pattern that is evident from hyper-spectral core scans but not in assay results. Including additional geometallurgical-related parameters, such as hardness or grindability, acid consumption, moisture content, or clay minerals, can also:




highlight potential processing issues or abnormal values that wouldn’t be recognised with a more limited dataset



help define trend surfaces, such as gradual changes in mean values that can be removed from the data to improve the quality of estimates



identify variables to be estimated that might not normally be included in the block models that represent the material to be mined.




Using geometallurgical and other parameters – by using self-organising maps, for example – also contributes to better domaining of the mineralisation. This is because it allows geologists to consider many more characteristics as they define which volumes of material are similar and which are distinct.
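As a rough illustration of the self-organising map idea, the Python sketch below clusters multi-parameter samples into candidate domains using the third-party minisom package. The four parameters and the random placeholder data are assumptions for the example, not values from any real deposit:

import numpy as np
from minisom import MiniSom  # third-party package: pip install minisom

# Illustrative sample table: each row is one sample with
# [grade, hardness, acid_consumption, moisture] values.
samples = np.random.rand(500, 4)  # placeholder data, not real assays

# Normalise each column so no single parameter dominates the distance metric.
samples = (samples - samples.mean(axis=0)) / samples.std(axis=0)

# Train a small 3x3 map: each of the nine map nodes becomes a candidate domain.
som = MiniSom(3, 3, input_len=4, sigma=1.0, learning_rate=0.5, random_seed=42)
som.train_random(samples, num_iteration=5000)

# Assign every sample to its best-matching node, i.e. a provisional domain label.
domains = [som.winner(s) for s in samples]

Samples grouped this way can then be reviewed by a geologist before being accepted as estimation domains.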



In addition:




Adding in big data — via techniques such as co-kriging using secondary variables — can help produce estimates that take more localised (at a selected mining unit scale) variations in the mineralisation into account, while still:



achieving acceptable slope of regression (a standardised measure of the quality of the estimates)



minimising conditional bias (true value is typically less than the estimate when the estimate is high, and the true value is greater than the estimate when the estimate is low).



If a mining company chooses not to complete, for reasons of time or money, a full analysis of all the attributes of all physical samples, geologists can use big data to fill in (impute) missing values using estimation techniques, proxy formulas, or correlations. Once they have all desirable attributes available for each sample, they can then return to more conventional techniques, such as kriging, to produce estimates or simulations — a set of equally probable realisations of the estimates — for all required parameters.
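As a minimal sketch of that imputation step, the Python fragment below fills gaps in a patchy assay column from a complete soft-data proxy; the file name, column names, and the linear relationship are illustrative assumptions, not part of any particular workflow:

import pandas as pd
from sklearn.linear_model import LinearRegression

# Hypothetical sample table: 'cu_assay' has gaps, while the soft-data
# proxy 'xrf_cu' was captured for every sample.
df = pd.read_csv("samples.csv")

known = df.dropna(subset=["cu_assay"])
proxy = LinearRegression().fit(known[["xrf_cu"]], known["cu_assay"])

missing = df["cu_assay"].isna()
df.loc[missing, "cu_assay"] = proxy.predict(df.loc[missing, ["xrf_cu"]])
df["cu_assay_imputed"] = missing  # flag imputed rows so they can be audited later

With every attribute populated (and the imputed rows flagged), the table can feed the conventional kriging or simulation runs described above.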



Incorporating big data as part of the resource modelling and estimation workflow increases the ability of resource geologists to:



highlight areas of higher risk (with, for example, elevated levels of deleterious elements or material with potential processing problems) that could be subject to additional environmental or social considerations



adopt the industry best practice scorecard approach to the classification of the Mineral Resource estimates (from low to high confidence: Inferred, Indicated and Measured)



improve mine site safety by identifying zones that might have poor ground conditions or require a change to standard mining practices (thereby introducing non-standard or unexpected behaviour).




The bad news



At the same time, however, all this additional information can result in an overload of big data, potentially many terabytes in size, that might have varying degrees of accuracy and must be separately validated before it can be used.



That validation can add substantial time and effort, since the new, non-traditional data must be made to work with — and be stored and visualised alongside — the traditional physical drilling information, such as lithologies and assays, typically found in a geologist’s resource database. Finally, big data also makes any automated modelling and simulation a complex, processing-intensive task.



So what should mining companies do?



With clear advantages to using big data, despite what it demands in time and effort, miners need to keep in mind that a poor dataset will always produce a poor estimation. A good dataset, taking into account all available data, will produce an estimation that is more statistically sound, with clearly defined reasoning behind each of the decisions made along the way.



To make the best use of all available data, mining companies should consider how they want to address four specific challenges:



1. Storage



The process of acquiring, validating, and analysing the base data for resource estimation is time consuming and expensive, which means miners must consider the value of the information and knowledge derived from that data when determining how they will store it.



They must also decide how long to store it for: it may take years or even decades before a company makes the decision to mine, while the mining operation itself can take place over decades, so the lifecycle of the data is also long. But even data that is decades old can remain valid and useful for analysis/modelling if it is appropriately stored and, most importantly, still available.



Currently, however, geologists often store the initial data they collect during the exploration phase on a laptop, which both limits access to this data by other project teams and increases the risk that the data — and its potential value — could be lost at any time if a geologist changes roles or the device is retired.



2. Multiple sources



Geologists need to be able to retrieve and use information easily, but the sheer range of data now available for geological modelling and resource estimation can make that difficult.



Today, base data comes in a wide variety of types: lab results supplied directly from a Laboratory Information Management System (LIMS); descriptions of the diamond drilling core from which the physical samples were extracted; data from hand-held/portable X-ray fluorescence (XRF) analysers; data historians that record drilling penetration rates; and metadata — i.e., those additional details, such as the time of day the data was collected and the person, company, or piece of equipment that collected it, that are vital for confirming whether the data is in its original form, has been manipulated or adjusted, or is a calculated average.



This leads to a dataset made up of a diverse collection of text files, Excel spreadsheets, and resource models in proprietary binary file formats, alongside data stored in geoscientific information management software packages and core scans, which alone can run to terabytes, with much of it collected at different times and by different people and equipment.
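To give a feel for the consolidation involved, here is a small Python sketch that merges two of those source types into one table while preserving provenance; the file names and layouts are invented for the example:

import pandas as pd

# Assumed exports: assays from a LIMS (CSV) and portable-XRF readings (Excel).
lims = pd.read_csv("lims_assays.csv")
xrf = pd.read_excel("field_xrf.xlsx")  # reading .xlsx needs the openpyxl package

# Tag each record with its origin: the kind of metadata the text calls vital.
lims["source"] = "LIMS"
xrf["source"] = "portable XRF"

combined = pd.concat([lims, xrf], ignore_index=True)
combined["loaded_at"] = pd.Timestamp.now(tz="UTC")  # when the record entered the set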



3. Data lifecycle



Large amounts of data from multiple sources acquired over many years increase the challenges of both data domaining (dividing the rock mass into volumes with similar characteristics that are distinct from each other) and the Mineral Resource classification process.



Resource geologists must now consider the lifecycle of the data used in resource classification and find a way to accommodate and flag drilling results and other data with lower confidence (or which failed the validation test), while not losing portions of that dataset, such as lithological/structural interpretations, that could still be used for resource modelling purposes.



Also, as more data is collected, geologists may deem historical data with no or inappropriate quality assurance/quality control (QA/QC) less reliable for use in mineral resource estimation, and must have a way to incorporate this finding into the database to ensure only the highest quality data is used. For example, if newer, more accurate collar/downhole surveys or laboratory analysis methods identify weaknesses in previously collected data, that new data could make the use of historical data (such as lithological contact positions or assay information) inappropriate, depending on how the data is used in the resource definition and estimation process.



The same might happen with biased historical data. Bias usually only becomes apparent and downgrades confidence in the data after a considerable period of time. It is crucial to maintain all metadata so that the data does not have to be revalidated before it is used in each resource update cycle.



4. Database management



To properly manage a resource estimation dataset that includes an array of big data, resource geologists need to be able to:




Discriminate between hard and soft data and any metadata that also needs to be included in the resource dataset, and to store their reasons for considering the data suitable for estimation or not.



Maintain the integrity of the resource dataset to ensure that the level of confidence (low to high) in the data can be used to appropriately:



classify the confidence level of the resource estimates



determine the risk profile of the decisions based on those estimates.



Control access to the database to:



ensure that only validated and approved data (as opposed to raw data on which the QA/QC has not been verified) is used in the resource estimation process



identify where other data has been confirmed as suitable only for modelling the geology (such as the extent of the mineralised lithologies) as opposed to estimating the mineral content or other properties of the material to be mined, including waste/overburden.



Provide proof of a strong chain of custody for all data that will confirm, for example, that assay data has not been manipulated. This proof will increase confidence in the estimates during external independent reviews, and illustrate that the dataset is being well governed — a vital consideration for financing.




The future is in the cloud



While these are significant challenges, they are not insurmountable, and the future for resource modelling and estimation is, in my opinion, in the cloud.



A cloud-based platform:




removes storage limitations and allows for on-demand access to both data and processing power



ensures high availability, which can replace current back-up and disaster recovery processes, except for those that are time-sensitive and can affect mining/production



provides a central location to store and share all data, with sufficient on-demand computing resources available to accommodate repeatable workflows rather than a collection of independent, difficult-to-back-up processes run on separate devices with limited processing power



offers the option of using standardised workflows to capture deep specialist knowledge, which then becomes permanently retained, role-based knowledge



enables processes that depend on access to powerful computing resources to be run more efficiently, both in time and cost, than using local machines with limited capabilities — for example, with a cloud-based computing platform, it becomes much more feasible to routinely undertake valuable studies when new data becomes available, such as simulating variations:



in the geology model and the resource estimates, and then building multiple mine plans based on these variations, or



in the beneficiation process when handling ore with differing chemical characteristics or ratios of ore types



makes it possible to:



quickly incorporate artificial intelligence and machine learning techniques in workflows to automate a number of time-consuming and/or repetitive tasks



construct workflows to produce financial models that incorporate much more of the underlying inherent variability of the mineralisation as opposed to those based on average assumptions — making true risk-based decisions using robust confidence intervals placed on key metrics, such as Net Present Value or Internal Rate of Return, possible.




In short, by being able to store, process, integrate, share, and display all available data types required for high-quality resource modelling and estimation, a cloud-based platform will contribute to overall improved orebody knowledge and understanding of the controls on mineralisation.



This in turn will result in significant downstream benefits, including better blending of material for processing, more consistent plant throughput, and ultimately, most importantly, higher product quality and increased profit.







Community is a place for GEOVIA users – from beginners to experts and everyone in between – to get answers to your questions, learn from each other, and network. Join our community to know more:



GEOVIA User Community – Read about industry topics from GEOVIA experts, be the first to know about new product releases and product tips and tricks, and share information and questions with your peers. All industry professionals are welcome to learn, engage, discover and share knowledge to shape a sustainable future of mining.



New member? Create an account, it’s free! Learn more about this community HERE.
 ]]>
      </content:encoded>
      </item>
<item>
      <title>
      <![CDATA[ Building Efficient and ESG-Compliant Mines with Virtual Twins ]]>
      </title>
      <link>https://blog--3ds--com.apsulis.fr/brands/geovia/building-efficient-and-esg-compliant-mines-with-virtual-twins/</link>
      <guid>https://blog--3ds--com.apsulis.fr/guid/275082</guid>
      <pubDate>Thu, 12 Dec 2024 13:13:45 GMT</pubDate>
      <description>
<![CDATA[ To meet demand, the mining industry must discover more copper deposits, increase inventory, and accelerate production. The challenge lies in developing mines quickly while adhering to ESG norms, environmental regulations, and community stakeholder buy-in. Integrating technology across the mining value chain can reduce uncertainty and expedite this process.
 ]]>
      </description>
      <content:encoded>
      <![CDATA[ 
By Gustavo Pilger, Director – Worldwide GEOVIA Research and Development Strategy and Management, Dassault Systèmes.



Fossil fuels currently meet about 75% of global energy needs. To transition to renewable energy sources by 2050, significantly larger quantities of minerals, especially copper, must be mined. The current rate of production will not suffice to meet demand. With only 800 million tons of known copper reserves, is it possible to supply this demand? If so, can it be done with minimal environmental impact while adhering to ESG norms and withstanding public scrutiny?



To meet demand, the mining industry must discover more copper deposits, increase inventory, and accelerate production. The challenge lies in developing mines quickly while adhering to ESG norms, environmental regulations, and community stakeholder buy-in. Integrating technology across the mining value chain can reduce uncertainty and expedite this process.



Avoiding silos in mining sub-processes with technology



Processes and sub-processes in mining typically occur in silos, leading to frustration among stakeholders over inefficiencies that often cause disruption and rework. Engagement is key to fostering collaboration that can break down silos and bring teams together toward common goals. It is important to communicate effectively in order to take stakeholders on a journey that results in a common understanding of contexts, objectives, constraints, risks, planning, and risk mitigation measures to achieve a given target. This is greatly facilitated when all stakeholders work collaboratively on a unified platform with consistent data models and a consistent user experience.



Mining can be seen as a series of interconnected systems, or a ‘system of systems,’ spanning securing permits and exploration down to development, production, beneficiation, sales and distribution, decommissioning, and site rehabilitation. How do you connect systems, technology, and people to ensure this system-of-systems “machine” works in a synchronized and harmonious manner while constantly chasing value and common KPIs? A first logical step is to clearly delineate the boundaries of these systems and identify how data flows, or how it should flow, within them.



This system-of-systems framework must recognize a constellation of systems and understand that the output of one system is the input of an adjacent one. It must also be aware of the decisions made within each system’s boundaries, as well as their implications and consequences for upstream and downstream processes. Virtual twin experiences provide this framework.




“Dassault Systèmes’ solution for sustainable, energy-efficient mining that meets environmental norms and revenue targets centers around virtual twin experiences.”








Virtual twins help embed sustainability in mining operations











The virtual twin provides a live, virtual replication of the real world, where processes and systems are interlinked and associated with one another. This includes the underlying data that informs and describes these processes, all interconnected from a multiphysics, multiscale, and multidisciplinary perspective.



With virtual twin experiences, associated data and intelligent methods help mining organizations pursue value throughout their operations, adapting to uncertainty and unplanned events of technical, mechanical, or market origins.



Can you change the way you operate by designing mines that extract more minerals more efficiently while minimizing energy use and complying with ESG standards and environmental norms? Yes, the virtual twin can assist in this transformation. It can model and simulate many possible outcomes and environmental and social scenarios, balancing efficiency and costs against ESG norms while maximizing safety and value.




“By connecting various data and systems, the virtual twin creates a unified collaborative environment, allowing stakeholders to identify priorities and measure performance against benchmarks for responsible and sustainable growth.”




The virtual twin also enables the management to control permit status, asset agreements, asset licenses, and associated cost analysis to ensure that everything proceeds according to plan.



The virtual twin ensures continuity between the natural environment, claim boundaries, and built infrastructure. Users can leverage immersive visualization with spatial contextualization to gain a comprehensive view of data for actionable insights. This ensures sustainability concerning energy and emissions, water, and biodiversity. Environmental, social, regulatory, and sustainability KPIs can be measured against benchmarks using the virtual twin.



The virtual twin also provides visibility into data with powerful integrated analytics and geospatial data. It allows the aggregation and propagation of data on land stewardship in line with company, compliance, and regulatory frameworks.




“Digital communities created through the virtual twin facilitate two-way data and model sharing, ensuring a unified user experience for all stakeholders.”








Why virtual twins are essential for energy efficiency in mining



Mining is one of the most power-intensive industries. Collectively, mining consumes about 11% of the world’s energy, primarily from fossil fuels. Therefore, powering mining and related infrastructure with renewable energy sources is essential for sustainability. Renewable energy also offers opportunities for cost savings, innovative mine designs, and resilience against uncertainties.




“With the virtual twin, mining energy systems can be designed as a single platform, allowing for the design and simulation of energy supply and sources.”




Many mine operators are likely to adopt a hybrid approach to energy. The virtual twin supports a hybrid approach to energy efficiency in mining by using advanced controls that enable miners to design, simulate, and visualize optimal energy configuration regimes while reducing overall operating costs, thereby de-risking the electrification process.



In short, virtual twin experiences should play a major role in building the next generation of copper mines and in attracting an increasingly digital-native workforce in order to achieve sustainability goals and ESG mandates. Virtual twins are not only essential for de-risking mining projects from a multidisciplinary perspective but also for navigating business complexity while ensuring sustainability. Ultimately, virtual twin experiences are a fundamental communication tool for engaging the communities where mines operate, as well as governments, regulatory agencies, shareholders, and employees, from the outset, while fostering sustainable mining projects with shared interests.







Community is a place for GEOVIA users – from beginners to experts and everyone in between – to get answers to your questions, learn from each other, and network. Join our community to know more:



GEOVIA User Community – Read about industry topics from GEOVIA experts, be the first to know about new product releases and product tips and tricks, and share information and questions with your peers. All industry professionals are welcome to learn, engage, discover and share knowledge to shape a sustainable future of mining.



New member? Create an account, it’s free! Learn more about this community HERE.
 ]]>
      </content:encoded>
      </item>
<item>
      <title>
      <![CDATA[ Improving Geological Modelling in the Age of Data Overload ]]>
      </title>
      <link>https://blog--3ds--com.apsulis.fr/brands/geovia/improving-geological-modelling-in-the-age-of-data-overload/</link>
      <guid>https://blog--3ds--com.apsulis.fr/guid/275067</guid>
      <pubDate>Thu, 12 Dec 2024 12:06:05 GMT</pubDate>
      <description>
      <![CDATA[ However, while today’s big data and advanced technology can help geologists generate more accurate models than ever before, there are still challenges to overcome in using all that data and technology successfully. Fortunately, there are also solutions.
 ]]>
      </description>
      <content:encoded>
      <![CDATA[ 
Authored by Jacques Nel, GEOVIA Senior Industry Process Consultant.



Geologists face many challenges in dealing with the unprecedented amount of geoscience data available today. In this article, Jacques Nel discusses software developed by Dassault Systèmes with the potential to help mines to rise above the chaos and use all of their data to generate more accurate geological models that give them a critical strategic edge.



A geological model is a specific representation of the location, characteristics, and extent of the lithology and ore types of a mineral deposit. It is based on the knowledge of the geologist and on data gathered through such sources as geological field observations, drillhole records, geophysical surveys, and assays. It is a vital input for both the resource model and mine planning, and affects virtually every decision made throughout the mining process.



To put it another way, the success or failure of a mine largely rests on the shoulders of the geologist responsible for the original geological model.



However, while today’s big data and advanced technology can help geologists generate more accurate models than ever before, there are still challenges to overcome in using all that data and technology successfully. Fortunately, there are also solutions.



Challenge 1: Importing and managing the data required for an accurate geological model



Geoscience data is key to any technical or financial evaluation of a mineral deposit because it helps define the site, shape, and grade of the orebody.



Today, however, geologists are inundated with so much new geoscience data — structural, geochemical, lithological, remote sensing, etc. — that they often don’t know what to do with it all. At the same time, valuable historical data may be stored on a number of different servers, on staff members’ personal laptops, on outmoded disks or drives, or even in long-forgotten physical filing cabinets. There may also be so many different versions of some datasets that it’s impossible to know which is the latest.



All of this can lead to a lack of confidence in the data used for modelling and downstream decision-making processes, and/or to mines using valuable time and money to re-explore or re-acquire lost data. It also makes it difficult for mines to comply with Environmental, Sustainability, and Governance (ESG) regulations that call for them to become more transparent in how they acquire, store, manipulate, and use geoscientific data.



Challenge 2: Visualising and analysing the data required for an accurate geological model



Data visualisation gives geologists insight into their geoscience information and helps them identify geological trends or patterns as well as erroneous data. Data visualisation, however, like data management, has been made much more complicated by the vast quantities of data available today from both exploration and operational projects — most of it in different data types from different sources in a variety of file formats.



Those different sources and formats make it difficult for geologists to visualise and analyse their data in one application, meaning they could miss important trends or patterns, while the workflows currently available to them for integrating separate data may be too time-consuming or unreliable to use. The result is that valuable geoscientific data may be inadvertently excluded from the geoscientific analysis, reducing the accuracy of the geological model.



Challenge 3: Interpreting and modelling the geology of a mineral deposit



Throughout much of the world, regulatory bodies require geologists to adhere to strict standard operating procedures and regulations when generating geological models. This is not easy considering:




the number of tasks that must be completed to generate a model



the amount of data linked to each task



the timeframe allocated for each task, and



the complexity of keeping track of tasks, data, and timeframes.




But failing to adhere to these operating procedures and regulations is not an option, because it could lead to non-compliance, which would have a negative impact on the mine and/or mine company.



Having data stored in multiple repositories in many different file types compounds this challenge by forcing geologists to interpret data in several separate applications, which once again (see Challenge 2 above) may lead to them missing critical trends or patterns, and then producing incorrect interpretations and inaccurate geological models.



Challenge 4: Validating and sharing a geological model



Once a geologist has interpreted geoscience data and created a base geological model, there are a number of different methods they can use to ensure it has been generated as accurately as possible. These include:




doing a geostatistical simulation to construct geological scenarios that can test or quantify the uncertainty of the model



conducting a visual analysis of the domains compared to the actual data



using query filters to test the model



making a direct comparison of the new geological model to the previous model, and/or



sharing the model across the organisation for collaborative peer review.




The challenge for many geologists right now, however, is that they may have only a single technique or application they can use to validate their model, which means they may miss certain issues that can render the model — and all the critical mine planning decisions made based on it — inaccurate. This in turn can lead to imprecise mining optimisations, possible safety issues, and less-than-expected production.



Sharing the model for peer review may also be difficult. Departments in the mining industry have traditionally acted in silos, which has made cross-departmental collaboration challenging. 



Many also still use email or flash drives to share important files, which makes it hard to be sure which version is the latest. Not all departments may have access to applications where they can view or statistically analyse the model, and acquiring new licences can be costly.



Addressing the challenges



Overcoming all four of these challenges — the need for secure, effective data management; comprehensive geological visualisation and analysis; correct data interpretation; and accurate model validation — begins with ensuring that all geoscience data is integrated, stored, analysed, interpreted, and managed in a single, platform-based centralised repository.



This centralised repository will ensure that the latest, most complete version of any data becomes the single source of truth for everyone to reference. 



And it has the added bonus of helping to minimise the financial cost of obtaining geoscience data by increasing the percentage of it actively used in creating models.











One example of a platform-based centralised repository is the Dassault Systèmes 3DEXPERIENCE platform. To assist with geological modelling, we have also developed a customised Geology Modelling repository, where all of a mine’s geoscience data is first securely stored on the platform, either on a mine company’s own premises or on a public or private cloud.



From this central hub, industry-proven applications, tools, and workflows make it far simpler for geologists to locate, interpret, display, and analyse their data, as well as to create, validate, and share geological models.



It works like this: The 3DEXPERIENCE platform connects to both GEOVIA Surpac geology and mine planning software and ENOVIA project and document management software. 



This enables geologists to drag and drop any data (drillhole, topographical point cloud, geophysical, assay, geotechnical, etc.) held on the platform in any format into the GEOVIA Surpac graphics window to begin work immediately — no data conversion or lengthy import or export processes required.



This kind of integration also has the benefit of automatic document versioning, data check-in/check-out, and user file and folder permissions to ensure traceability and accountability.



The software combination also gives geologists the ability to:




synchronise data for fast 3D visualisation



apply geological reasoning and logic to sculpt domain solids from all available data



create and compare various interpretations for “what if” scenario analysis to quantify the natural uncertainty



track the entire evolution of the geology model from data interpretation to model generation and validation



create a project plan, assign tasks, and monitor project progress (and any issues or bottlenecks) against the plan using a variety of visual methods, and



share the 3D geological model and any statistical reports and charts with stakeholders through customisable dashboards or communities.




The result is improved compliance with Standard Operating Procedures, ESG regulations, and the mine’s own KPIs. It also allows greater confidence in the quality of the geoscience data and the accuracy of the geological model, and better decision-making throughout the life of the mine.







Community is a place for GEOVIA users – from beginners to experts and everyone in between – to get answers to your questions, learn from each other, and network. Join our community to know more:



GEOVIA User Community – Read about industry topics from GEOVIA experts, be the first to know about new product releases and product tips and tricks, and share information and questions with your peers. All industry professionals are welcome to learn, engage, discover and share knowledge to shape a sustainable future of mining.



New member? Create an account, it’s free! Learn more about this community HERE.
 ]]>
      </content:encoded>
      </item>
<item>
      <title>
      <![CDATA[ Debugging Abaqus Models ]]>
      </title>
      <link>https://blog--3ds--com.apsulis.fr/brands/simulia/debugging-abaqus-models/</link>
      <guid>https://blog--3ds--com.apsulis.fr/guid/264821</guid>
      <pubDate>Thu, 12 Dec 2024 08:00:00 GMT</pubDate>
      <description>
<![CDATA[ This blog is a comprehensive exploration of the process of debugging models in Abaqus/Standard, with a specific focus on resolving convergence issues. It compares Abaqus/Standard with Abaqus/Explicit, walks through iterative methods and debugging strategies, and highlights the importance of understanding model features, adopting a systematic approach, and persevering in the face of issues.
 ]]>
      </description>
      <content:encoded>
      <![CDATA[ 
The following blog was authored by Ritu Singh, a mechanical engineer from Auburn University, Alabama, with six years of experience. She excelled as a senior design engineer for three years before transitioning to a writer and team lead for three years, specializing in creating accessible content. Currently, Ritu serves as an Advocacy Offer Marketing Specialist in Global Marketing at Dassault Systèmes for the SIMULIA brand, where she combines her engineering acumen with her writing skills to craft compelling marketing content.



Introduction



Debugging Abaqus/Standard models can be complex, especially when confronting convergence problems in structural simulation.



Debugging can refer to a range of things, from building and fixing problems in a mesh to correcting basic modeling mistakes, misspelled keywords, and more. This blog, however, focuses specifically on fixing convergence problems in Abaqus/Standard.



FAQs




Q: How can I find examples that use a certain feature in Abaqus to familiarize myself with its usage?




A: You can search the online documentation or use the Abaqus findkeyword utility to find examples that use a certain feature. The input files associated with these examples are provided as part of the Abaqus installation, which you can fetch using the Abaqus fetch utility.




Q: What is the Abaqus Verification Guide, and how can it help me learn to use a new capability?




A: The Abaqus Verification Guide contains test cases that verify that the implementation of the numerical model produces the expected results for one or several well-defined options in the code. Running these problems when learning to use a new capability can help ensure you use it correctly.




Q: What is the Job Diagnostics tool in Abaqus/CAE, and how can it help me debug my model?




A: The Job Diagnostics tool in Abaqus/CAE allows you to monitor the progress of your analysis job and understand the convergence behavior of your job. It provides detailed information about each step, increment, attempt, and iteration of the analysis, which can help you identify and correct issues with your model.




Q: What is the difference between average force and time average force in the Residuals tab of the Job Diagnostics dialog box?




A: The average force is the force applied to the model at a given time step, while the time average force is the average value of the force over the entire analysis. The Residuals tab of the Job Diagnostics dialog box displays the values of both quantities for each iteration, which can help you identify the cause of convergence issues.




Q: What is the Getting Started with Abaqus plug-in, and how can it help me run Abaqus examples?




A: The Getting Started with Abaqus plug-in is a tool that allows you to run the examples described in the Abaqus documentation. It creates a model and a job for each example, which you can submit for analysis in the Job module and view the results in the Visualization module. The plug-in also fetches the input files associated with the examples and places them in the current directory.



Understanding Convergence in Abaqus/Standard vs. Abaqus/Explicit



Abaqus/Standard is the original Abaqus solver code, dating back to the early 1980s. It is a finite element solver code, also known as an implicit solver, with many capabilities. These range from general nonlinear static and dynamic simulations to linear simulations, including linear dynamics, heat transfer, acoustics, piezoelectric effect, and more.



Abaqus/Standard uses an incremental, iterative approach for general simulations. It is built around the Newton-Raphson method, a numerical technique for iteratively solving the nonlinear equilibrium equations; when the iterations fail, a comprehensive debugging strategy is required.



This method’s successful completion results in so-called ‘convergence,’ while its failure leads to non-convergence. It is crucial to distinguish between these two states to address non-convergence effectively.
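For readers who have not met the method, the Python sketch below shows the Newton-Raphson idea on a single degree of freedom. The cubic spring law is an invented stand-in for the internal-force computation a real solver performs:

# Solve f_int(u) = f_ext for one degree of freedom by Newton-Raphson.
def newton_raphson(f_int, tangent, f_ext, u0=0.0, tol=1e-8, max_iter=16):
    u = u0
    for _ in range(max_iter):
        residual = f_ext - f_int(u)
        if abs(residual) < tol:
            return u, True            # converged
        u += residual / tangent(u)    # correction from the tangent stiffness
    return u, False                   # non-convergence: a real code cuts back

# Invented stiffening spring: f_int(u) = 10*u + 4*u**3.
u, converged = newton_raphson(lambda u: 10*u + 4*u**3,
                              lambda u: 10 + 12*u**2,
                              f_ext=25.0)
print(u, converged)

Abaqus/Standard performs the same kind of loop on thousands of coupled equations at every increment; when the loop fails, it cuts the increment back and tries again.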



In contrast, Abaqus/Explicit is a dynamic explicit package released around 1992. It is an explicit solver built on fundamentally different technology. Its capabilities include general nonlinear dynamic simulations, heat transfer and acoustics coupled with structural analysis, large-deformation methods, and robust contact algorithms suitable for complicated 3D contact models.



As a dynamic explicit analysis package, Abaqus/Explicit does not use the Newton-Raphson method and therefore does not have convergence issues. However, explicit codes can face numerical stability concerns.



Users facing severe convergence problems in Abaqus/Standard can consider switching to Abaqus/Explicit. Since the interfaces of the two solvers are notably similar, transitioning to Abaqus/Explicit can help mitigate convergence challenges encountered in Abaqus/Standard simulations.



Example: Connector Spring



The first example involves a one-element model with one connector element to which a force is applied. A nonlinear spring stiffness balances the force. The connector element is a Cartesian-Cardan connector. Its first component of relative motion has a nonlinear stiffness, while the other components of relative motion are constrained.



Since the spring is nonlinear in one direction, this model is relatively straightforward and uses a static load step. This means we are going to test how far the spring stretches under the load. It is crucial to identify the critical model features when debugging a convergence issue:




Connector element with a nonlinear stiffness



Boundary conditions, including connector motion constraints



Concentrated loads



Static procedure.




We must understand the features in the model even if debugging is not required. Understanding how to use the model features is as crucial as understanding the problems they can cause, such as:




Nonlinear springs may have non-monotonic force/deflection behavior



Multiple constraints might interfere with each other.



Follower loads may require the use of the unsymmetric solver.



The simulated process may be quasi-static, causing the static analysis to fail.




Knowledge of potential pitfalls is developed through both training and experience. This is true for almost any complicated human endeavor. Training allows one to avoid relying solely on experiential learning while enhancing one’s ability to identify and address challenges effectively.



In the following example, the model fails to complete as-is. The first thing to do is look at the status file. (You may prefer to use the Job Diagnostics tool in Abaqus/Viewer, but I prefer text files.) The green text indicates a convergence problem.











My log file contained an error code message, but these are lower-level, system-related problems that I would need to investigate separately. Here, we are facing a run-of-the-mill convergence problem. Once we have identified the issue of the analysis abruptly terminating, we must examine the message file.











I prefer to first inspect the end of the file. I like to do this while I have the ODB open in Abaqus/Viewer with the deformed shape of the last saved result displayed. In this case, the message indicates that the required time is less than the minimum specified, which is a generic message.



Often, error messages reiterate what’s already known. When I review the message file, I typically have a picture displayed of the model’s deformed shape just before the error occurred. However, since this is a single-element model, there’s not much to visualize.



Now that I know that excessive cutbacks caused the error, I can backtrack through the message file, noting the critical nodes and elements and locating them in the deformed mesh. To identify patterns in the numbers, I ask myself questions such as:




Does the same group of nodes consistently have the largest residuals?



Does the same group of nodes consistently cause problems with the contact?



Are the largest corrections at the same few nodes?



Is the plasticity seemingly out of control in the same group of elements? Etc.




The number patterns can help you identify the region of the mesh that is causing the problem. You can quickly locate these entities within the displayed mesh.
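A throwaway script along the following lines can do the tallying. The residual-message wording is an assumption based on typical message files, so adjust the pattern to whatever your release actually prints:

import re
from collections import Counter

# Count how often each node appears in the residual diagnostics of job.msg.
pattern = re.compile(r"LARGEST RESIDUAL FORCE.*?AT NODE\s+(\d+)")

counts = Counter()
with open("job.msg") as msg:
    for line in msg:
        match = pattern.search(line)
        if match:
            counts[int(match.group(1))] += 1

print(counts.most_common(5))  # the usual suspects, ranked by frequency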



Now, with the job interrupted abruptly, you hopefully have some intermediate results saved. You must formulate a hypothesis to mitigate the issue. This hypothesis must align with the identified problematic mesh region and potential issues inherent to the model’s features.



You must scrutinize known results through animations or contour plots of stress and displacement to refine this hypothesis. With a working hypothesis, the next step involves modifying the model to address the hypothetical problem.



With luck, once you implement the fix, the issue will be resolved. However, if the problem persists or new complications arise, an incorrect hypothesis could be the cause. In such scenarios, especially when dealing with convergence problems, it is crucial to practice perseverance.



In this case, the problems are observed at node two. This is expected: of the two nodes, it is the only one free to move. The mesh is in equilibrium, but there is a problem finding another equilibrium state with a slightly larger load. This leads me to hypothesize that there is something odd about the nonlinear stiffness.







To gain further insight, we would need to implement a technique promoted in training: displacement-controlled loading. Instead of applying a load, we can stretch the spring with an applied boundary condition and observe the results. This model has simple force-controlled loading, which can be easily converted to displacement control. By applying a fixed displacement, we can modify the model and successfully complete the analysis.







The force versus displacement plot shows that the non-monotonic force-deflection behavior hinders force ramping in a static procedure. It clearly exposes the problem: there is no equilibrium state close to the current one once the load reaches about 2.0. Combined with the default ramp amplitude used in Abaqus static steps, this makes the force difficult to apply.
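The toy Python spring below (an invented cubic law, not the actual connector behavior) shows the same effect: under displacement control every point on the curve is reachable, whereas a static force ramp has no nearby equilibrium once it passes the local maximum:

# Invented non-monotonic spring: force rises, dips, then rises again.
def f_int(u):
    return u**3 - 3*u**2 + 2.5*u  # local maximum near u ~ 0.59

# Displacement control: prescribe u and simply read off the reaction force.
for k in range(31):
    u = 0.1 * k
    print(f"u = {u:4.1f}   reaction = {f_int(u):8.4f}")

# Force control would instead have to solve f_int(u) = F. Just past the
# local maximum (F above roughly 0.64) the nearest equilibrium jumps far
# away, so a static force ramp stalls there, as in this example.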







There are a couple of solutions that can be considered to address this problem:




Interpolate a displacement corresponding to a load of 2.0 and rerun.



Run a two-step simulation and switch to force control as shown below.



Use the relatively new *STEP CONTROL option.












This example showcases the broader complexities of nonlinear simulation debugging, highlighting the need for a methodical approach to identifying and resolving convergence problems.







Example: Plate Through Ring







The second example illustrates a convergence issue frequently encountered in finite element analysis using a thin elastic disk pulled axially through a rigid ring with an elliptical cross-section. The disk is supposed to be pulled entirely through, but the simulation fails, leading to the dreaded message indicating that the analysis has not been completed. This example highlights a common frustration due to the convergence problem during such simulations.







Let us view the load step that I created. The step already uses displacement-controlled loading, so there is no hope of solving the problem by switching to it.



The important model features include:




Quadratic brick element type C3D20RH



Hyperelastic material



General Contact



Boundary Conditions



Rigid body constraint



Static procedure




Once we have identified the model’s key features, the next step is to identify the pitfalls and understand how to use these features to produce a quality result. Understanding stability in the context of hyperelastic materials before using them in a model is vital, since hyperelastic materials should be stable in the range of expected strain.



We may also encounter issues with contact. For example, contact may require the unsymmetric solver, especially if there is friction, so we may need to activate it when faced with a convergence problem in a contact model in Abaqus/Standard. We must avoid over-constraints and watch out for a conflict with the static procedure when simulating a quasi-static process like this one.







The model makes it only partway through the step, prompting the message indicating the analysis has not been completed. There are numerous negative eigenvalue messages in this message file. Based on the residuals and the time average force, it is evident that the model is in an acceptable equilibrium state.











Let us formulate a reasonable working hypothesis at this stage. We are using contact best practices. We have general contact, and the unsymmetric solver is activated due to friction. The hyperelastic material is stable. Because of their positions, there is no possibility of over-constraints.



Numerous persistent negative eigenvalue messages exist. This, together with the animation of the partial results, leads to a hypothesis of buckling. We need a solution strategy to continue the post-buckling behavior and pull the disk through completely. Various techniques, such as the Riks method, static stabilization, quasi-static implicit dynamics, and explicit dynamics could help us resolve this problem.



Let us switch from static procedure to quasi-static implicit dynamics in Abaqus/Standard.







We allow for a very small increment size and increase the number of cutbacks. In some cases, the simulation moves along smoothly with a reasonably large increment size, but once buckling behavior is observed, we must transition to a small increment size. Sometimes it may take a minimal increment and more than five cutbacks, the default number. When conducting implicit dynamics, it is crucial to consider time as physical, not a normalized quantity as in static analyses.
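Schematically, the automatic incrementation being tuned here behaves like the Python loop below. This is a conceptual sketch, not Abaqus source; the growth factor, cutback factor, and limits stand in for the quantities the step controls expose:

def run_step(solve_increment, t_end, dt_initial, dt_min, max_cutbacks=5):
    """Advance from t=0 to t_end, cutting the increment back on failure."""
    t, dt = 0.0, dt_initial
    while t < t_end:
        attempts = 0
        while not solve_increment(t, dt):  # False: the iterations diverged
            attempts += 1
            dt = dt / 2.0                  # cut back the time increment
            if attempts > max_cutbacks or dt < dt_min:
                raise RuntimeError("increment abandoned: too many cutbacks")
        t += dt
        if t < t_end:
            dt = min(dt * 1.5, t_end - t)  # grow cautiously after success

Raising max_cutbacks and lowering dt_min is the script-level analogue of the adjustments described in this example.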











The solution to this convergence issue lies in modifying the simulation procedure. The problematic region of the curve is navigated, allowing the simulation to be completed successfully.



This example highlights the importance of understanding the simulation’s specific characteristics and applying tailored strategies to address convergence issues in complex finite element analyses.



Example: O-Ring Compression and Relaxation











The final example is a hyperelastic/viscoelastic O-ring being compressed and then relaxed. The green circular component in the image above represents the O-ring. In a static step, the rigid plate compresses the O-ring into a groove in an elastic material. The plate is held fixed while the seal relaxes in a second step using the *VISCO procedure.



This model’s features include general contact, boundary conditions, and symmetry planes. When the analysis was run, the first step was not completed. The debugging process involves thoroughly examining partial results, animations, and the message file to identify the source of the problem.







The message file indicates that while the equilibrium is good, there are problems with contact. Note the messages about sticking and slipping.



The working hypothesis is that the O-ring is experiencing stick-slip behavior, which causes problems with the static procedure. A workaround is to use quasi-static implicit dynamics. In this case, switching the procedure makes things worse, which can happen. The message file indicates a contact problem with edge-to-face contact at nodes 4 and 4559, which are at the sharp edge of the groove.







Let us view the edges that are in the general contact. We can visualize GENERAL_CONTACT_EDGES_3 in the contact domain using Abaqus/Viewer. We notice that there are unwanted edges on the symmetry boundaries. There are options in general contact in Abaqus/Standard to remove edges from the contact domain. Let us remove those and try again.



Once the whole model is running, we can consider reverting to a static procedure. Making one change at a time works best, so the procedure is not changed yet.







We change the contact to eliminate the edges on the symmetry plane and switch the first step to a dynamic procedure. This gives us physical time and a viscoelastic effect. It also introduces an inertial effect, which means we must pay close attention to the time for this step.



We use a step time of 0.1 s for the compression and a relaxation time of 30 s for the second step. Since the loading time for this small O-ring is short compared to the relaxation time, 0.1 s seems like a good choice for the first step.
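

Under those assumptions, the two-step setup might be scripted as in the sketch below; the names and tolerances are placeholders, not the exact input of this model.

```python
# Sketch of the two-step O-ring analysis: a 0.1 s quasi-static dynamic
# compression followed by 30 s of *VISCO relaxation. Names are hypothetical.
from abaqus import mdb
from abaqusConstants import QUASI_STATIC, ON

model = mdb.models['Model-1']

# Step 1: compress the O-ring. Implicit dynamics makes time physical, so
# viscoelastic (and inertial) effects act during loading.
model.ImplicitDynamicsStep(
    name='Compress', previous='Initial',
    timePeriod=0.1, application=QUASI_STATIC, nlgeom=ON,
    initialInc=0.001, minInc=1e-9, maxNumInc=10000)

# Step 2: hold the plate fixed and let the seal relax (*VISCO procedure);
# cetol is the creep/viscoelastic strain error tolerance.
model.ViscoStep(
    name='Relax', previous='Compress',
    timePeriod=30.0, initialInc=0.1, minInc=1e-6, maxNumInc=1000,
    cetol=1e-4)
```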



Although step 1 now succeeds, the simulation fails yet again: we get past the compression step and into the *VISCO step, but it then fails. Analyzing the partial results, animations, and the last saved result of step 2 reveals a suspicious deformation pattern in the reduced-integration mesh.







A plot of the seal's deformed shape indicates hourglassing of the C3D8RH elements. This is where perseverance is required. We can eliminate the hourglassing by replacing the reduced-integration elements with fully integrated ones, so the next step is to switch the brick element to C3D8H and try again. Rerunning the simulation produces a complete result. The solution is now successful, and we can work on model refinements at our leisure.
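

For reference, the element switch is a one-line change on the *ELEMENT keyword (TYPE=C3D8RH to TYPE=C3D8H); scripted, it might look like the following sketch, with hypothetical part and set names.

```python
# Minimal Abaqus/Python sketch: assign fully integrated hybrid bricks
# (C3D8H) to the seal to suppress hourglassing. Part and set names are
# placeholders for illustration.
from abaqus import mdb
from abaqusConstants import C3D8H, STANDARD
from mesh import ElemType

part = mdb.models['Model-1'].parts['ORING']

part.setElementType(
    regions=(part.sets['SEAL'].cells, ),
    elemTypes=(ElemType(elemCode=C3D8H, elemLibrary=STANDARD), ))
```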















I decided to refine the mesh and round the edge of the groove. Convergence is often easier to obtain when the edge is rounded. The radius of the round should be large enough for the mesh to conform to it; if you need a sharper edge, use a more refined mesh.











Once we implement these changes, the simulation is successful. This example has demonstrated the importance of persistence and adaptability in resolving convergence challenges.



General Debugging Techniques and Strategies for Finite Element Models



Debugging Abaqus/Standard convergence issues can be daunting. A checklist is crucial for identifying and resolving these challenges in your model.



Here are the key steps to keep in mind to effectively debug a model in Abaqus/Standard:




Know your model features and how to use them properly.



Identify the potential pitfalls that your model features can cause.



Apply all available information to form a hypothesis. Partial results are beneficial; remember not to artificially limit the output while debugging.

Read the error/warning messages and analyze the message file.





Define a workaround for the hypothetical problem.



It isn’t uncommon for initial attempts to fail. Perseverance is the key to completing the analysis.




The last thing to remember is that training and experience teach the pitfalls of specific features, sharpening your ability to form hypotheses and implement practical solutions for convergence issues.



Here is a list of recommended training classes and resources for further learning.




Obtaining a Convergence Solution with Abaqus



Modeling Contact and Resolving Convergence Issues with Abaqus



Abaqus/Explicit: Advanced Topics



Buckling, Post Buckling, and Collapse Analysis with Abaqus



Any training class is beneficial.




These educational resources are available on the SIMULIA website. I encourage you to explore these training opportunities if you are looking to enhance your problem-solving capabilities in debugging convergence issues with Abaqus/Standard.



Conclusion



In conclusion, debugging convergence issues in Abaqus/Standard is an intricate process that requires a deep understanding of the model&#8217;s features and potential problems. Training and experience assist in accurate hypothesis formulation to implement successful solutions. The Newton-Raphson method is central to the solver, and when non-convergence occurs, a systematic approach is employed to diagnose and remedy the issues.



Partial results, error messages, and animations are invaluable in this process and guide modifications toward successful outcomes. Persistence is required, as initial attempts may not always resolve the issues.



Ultimately, a systematic and knowledgeable approach, backed by perseverance, is pivotal to mastering the debugging of finite element models in Abaqus/Standard.



To learn more about mastering debugging in Abaqus/Standard, we invite you to access our recorded session here.











Interested in the latest in simulation? Looking for advice and best practices? Want to discuss simulation with fellow users and Dassault Systèmes experts?&nbsp;The&nbsp;SIMULIA Community&nbsp;is the place to find the latest resources for SIMULIA software and to collaborate with other users. The key that unlocks the door of innovative thinking and knowledge building, the SIMULIA Community provides you with the tools you need to expand your knowledge, whenever and wherever.
 ]]>
      </content:encoded>
      </item>
<item>
      <title>
      <![CDATA[ Optimizing Battery Range and Thermal Comfort in Electric Vehicles ]]>
      </title>
      <link>https://blog--3ds--com.apsulis.fr/brands/simulia/optimizing-battery-range-thermal-comfort-electric-vehicles/</link>
      <guid>https://blog--3ds--com.apsulis.fr/guid/274518</guid>
      <pubDate>Thu, 05 Dec 2024 08:00:00 GMT</pubDate>
      <description>
      <![CDATA[ This blog explores how Dassault Systèmes employs cutting-edge simulation technologies and advanced computational tools such as Computational Fluid Dynamics (CFD) and 1D system modeling to accurately simulate and optimize these complex interactions and thereby improving both battery efficiency and cabin comfort in EVs
 ]]>
      </description>
      <content:encoded>
      <![CDATA[ 
Introduction



The electric vehicle (EV) industry is making great strides towards sustainable mobility, pushing the boundaries of what’s possible with clean energy. Although EV sales currently account for around 16% of the global market, they’re expected to rise sharply to 50% by the end of 2035 (see Figure 1) as manufacturers work to overcome significant challenges: manufacturing constraints such as material availability for batteries and electric components, the availability of charging infrastructure, vehicle thermal challenges such as battery range and thermal comfort, bringing the cost of EVs on par with ICE vehicles, model proliferation, multi-physics engineering development, and range anxiety. Among the most pressing of these are battery efficiency and occupant comfort, particularly under varying climatic conditions.



Figure 1: Global EV sales forecast. Source: https://ev-volumes.com/news/ev/evs-forecast-to-account-for-two-thirds-of-global-light-vehicle-sales-in-2035







In colder climates, for example, EVs lack the waste heat from internal combustion engines that would otherwise help heat the cabin. This leads to increased power consumption by climate systems, which in turn reduces battery range. This blog explores how Dassault Systèmes employs cutting-edge simulation technologies and advanced computational tools such as Computational Fluid Dynamics (CFD) and 1D system modeling to accurately simulate and optimize these complex interactions, thereby improving both battery efficiency and cabin comfort in EVs.



Motivation: Why Thermal Management is Key for Electric Vehicles



Thermal management directly impacts an EV’s performance, as it influences both battery efficiency and passenger comfort. Unlike ICE vehicles, EVs must expend additional energy to heat the cabin, which can reduce range by more than 15% in hot or cold conditions; in extreme weather, range can drop by up to 40% (see Figure 2).



Figure 2: Winter range for popular EV models. Source: recurrentauto.com







Thermal comfort is particularly relevant to drivers in extreme climates who rely on climate control features like seat warmers, cabin heaters, and defrosters. These features consume a significant portion of the EV&#8217;s battery power, creating a trade-off between energy used for passenger comfort and energy available for range.



Challenges in the EV Thermal Management Landscape



Real-world applications come with their own set of challenges. Extreme temperatures, complex thermal interactions, and fluctuating driving conditions all impact energy consumption in ways that require ongoing refinement of EV designs. For instance, urban stop-and-go traffic cycles and high-speed highway driving affect airflow and cooling needs differently, meaning that a single HVAC configuration is unlikely to suit every scenario.



Beyond comfort, EV designers also face the task of balancing battery health and efficiency. Batteries perform optimally within a specific temperature range. Excessive heating or cooling can accelerate degradation, affecting long-term performance and lifespan. Advanced simulations like those used in this study help identify the optimal trade-offs between battery protection and energy efficiency.



The Role of Advanced Simulation: To tackle this challenge, Dassault Systèmes has leveraged a co-simulation approach that integrates detailed 3D CFD models with system-level models. This approach allows engineers to simulate how heat moves through the vehicle and how energy is used by the climate system, helping to optimize both comfort and efficiency.



Methodology: A Multiscale Approach for Better Range and Comfort



A study conducted by Tesla shows that optimizing individual EV components results in a 15-25% improvement in e-drive efficiency, whereas optimizing the system as a whole improves e-drive efficiency by 40%. Range should therefore not be viewed through a single component, but at the level of the vehicle overall.



An advanced virtual twin can effectively assist in determining the optimal size of the HVAC system while enabling early-stage predictions for battery life, vehicle range, and passenger comfort. This study uses a co-simulation approach that combines system-level and 3D CFD models, merging the results of 3D thermal optimization into the overall system model to capture complex, multi-scale interactions and enhance overall vehicle performance. Integrating 3D CFD with Finite Element Analysis (FEA) thermal models provides precise predictions of passenger comfort levels and battery temperature distribution, while Dymola system behavior models allow real-world driving scenarios to be simulated for all vehicle systems. The 1D system simulation is performed in Dymola, which determines the heat exchanger inlet temperatures and provides them as input to the 3D CFD simulation. The 3D CFD simulation is then performed with PowerFLOW / PowerTHERM, and the resulting cabin outlet temperature is fed back to the 1D system model as input; the cycle then repeats (see Figure 3). This approach benefits from the detailed response trends provided by 3D CFD analysis, facilitating quick adjustments via 1D models to optimize vehicle performance.
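

In outline, this exchange is a simple fixed-point loop between the two models. The sketch below is illustrative only; the wrapper functions around Dymola and PowerFLOW/PowerTHERM are hypothetical stand-ins, not actual Dassault Systèmes APIs.

```python
# Illustrative sketch (not Dassault Systèmes code) of the coupled 1D-3D
# exchange described above. run_dymola_1d() and run_cfd_3d() are
# hypothetical wrappers around the respective solvers.

def run_dymola_1d(cabin_outlet_temp):
    """Placeholder: 1D Dymola HVAC run -> heat exchanger inlet temps [C]."""
    raise NotImplementedError("wrap your Dymola export here")

def run_cfd_3d(hx_inlet_temps):
    """Placeholder: PowerFLOW/PowerTHERM run -> cabin outlet temp [C]."""
    raise NotImplementedError("wrap your CFD case here")

def co_simulate(t_cabin_out=20.0, max_cycles=20, tol=0.05):
    """Fixed-point exchange between the 1D system and 3D CFD models."""
    for cycle in range(max_cycles):
        hx_inlet_temps = run_dymola_1d(cabin_outlet_temp=t_cabin_out)  # 1D -> 3D
        t_new = run_cfd_3d(hx_inlet_temps=hx_inlet_temps)              # 3D -> 1D
        if abs(t_new - t_cabin_out) < tol:   # interface values have settled
            return t_new
        t_cabin_out = t_new
    return t_cabin_out
```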



Figure 3: Coupled 3D CFD – 1D system Model simulation process  







Additional 3D CFD analyses assess underhood airflow, allowing engineers to characterize how external air affects the HVAC heat exchangers under various conditions. This airflow data is then applied to the 1D HVAC system analysis to establish realistic boundary conditions.



Figure 4: Coupled 3D CFD – 1D system Model simulation  







System Model



The first step is creating a 1D model of the HVAC system in Dymola, which assesses energy consumption, cabin temperatures, and vehicle range under different drive cycles, vehicle aerodynamic changes, and HVAC modes. This model helps determine the size and power needs of the HVAC system to balance thermal comfort with battery efficiency. The Dymola Modelica libraries offer a wide variety of predefined models and components that simplify building the system model, such as:




Realistic drive cycles for various speeds (e.g., high- and low-speed WLTP drive cycles)



Battery library to model range, ageing, time to charge and cooling



HVAC library to model driving modes, flow rates and thermal exchange



Driveline and Chassis libraries to model the aerodynamic properties of the vehicle




Figure 5: 1D system Model of an EV  







3D CFD for Localized Thermal Comfort



Using CFD simulations, one can analyze how heat moves through the cabin, affecting each passenger’s thermal sensation, alongside the temperature of electronic components such as display monitors and mobile rests. This level of detail provides a realistic representation of the different HVAC modes under different climatic conditions and of comfort across different body areas, and helps inform design decisions for optimal airflow and heating distribution.



Figure 6: 3D CFD Cabin simulation of an EV  







Underhood Cooling Airflow Analysis



A separate 3D CFD model simulates airflow around the vehicle’s heat exchangers, which are part of the HVAC system. This analysis captures how external air impacts cooling and heating, allowing for realistic boundary conditions and ensuring that all HVAC components operate efficiently.



The initial step involves using detailed 3D CFD analysis to characterize the external airflow that reaches the front end of the HVAC system&#8217;s heat exchangers under various vehicle operating conditions. A drive-cycle-specific Design of Experiments (DoE) approach is applied to select sample points that comprehensively represent these conditions based on their frequency of occurrence. This process, illustrated in Figure 7, defines the entire approach for characterizing airflow beneath the hood.



Figure 7: Underhood cooling airflow characterization 







Next, CFD simulations are conducted for each sample point to determine the heat exchanger’s inlet mass flow rate and back pressure. A response surface model is then constructed using 2D linear interpolation to estimate values between these sample points. This model provides realistic boundary conditions for the 1D HVAC system analysis, allowing further simulations with accurate inputs for the heat exchangers.
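

Conceptually, the response surface is a lookup built from the DoE runs. The short sketch below illustrates the interpolation step with SciPy; the sample points and values are invented placeholders, not data from this study.

```python
# Illustrative sketch of the response-surface step using 2D linear
# interpolation (SciPy). The DoE sample points and CFD results below are
# invented placeholders, not data from the study.
import numpy as np
from scipy.interpolate import LinearNDInterpolator

# DoE sample points: (vehicle speed [km/h], fan speed [rpm])
samples = np.array([[0.0, 1000.0], [0.0, 3000.0],
                    [60.0, 1000.0], [60.0, 3000.0],
                    [120.0, 1000.0], [120.0, 3000.0]])

# Heat exchanger inlet mass flow rate [kg/s] from the 3D CFD runs
mass_flow = np.array([0.10, 0.28, 0.35, 0.42, 0.60, 0.65])

# 2D linear interpolation between the sampled operating points
response = LinearNDInterpolator(samples, mass_flow)

# Boundary condition for the 1D HVAC model at an unsampled operating point
print(response(80.0, 2000.0))
```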



Real-World Application: Analyzing Drive Cycles and Compressor Speeds for Optimal Efficiency and Comfort



A transient coupled 1D-3D simulation was conducted on a compact passenger EV. The vehicle features a battery pack of 384 prismatic cells with a total capacity of 15.6 kWh. The HVAC system is designed to cool the cabin in summer and to serve as a heat pump in winter. This analysis focused on a cold-weather scenario: a 30-minute (physical time) drive at -10°C ambient temperature, with human comfort models for the driver and passenger and an HVAC flow split of 60% to the face and 40% to the feet.



The Worldwide Harmonized Light Vehicles Test Procedure (WLTP) drive cycles were used, as they reflect typical urban and highway driving patterns globally. The simulations explored two drive cycle speeds, a regular WLTP cycle and an additional low-speed WLTP cycle representing city driving, allowing realistic evaluation of the HVAC system under variable conditions. Three compressor speeds (high, medium, and low) were analyzed, with particular attention to the HVAC system’s energy consumption, as it directly impacts passenger comfort. Additionally, the expected range for each configuration was estimated by analyzing the battery’s state of charge at the start and end of the drive cycle; one plausible form of that estimate is sketched below.
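

This illustration is our own, not the study's exact method; the cycle distance and state-of-charge values are invented.

```python
# Toy illustration: extrapolate full-charge range from the battery state of
# charge (SOC) at the start and end of one drive cycle. Values are invented.
def estimated_range_km(cycle_distance_km, soc_start, soc_end):
    """Extrapolate range from the SOC consumed over one drive cycle."""
    soc_used = soc_start - soc_end           # fraction of capacity consumed
    return cycle_distance_km * soc_start / soc_used

# e.g. a 23.3 km cycle that drains the pack from 90% to 80% SOC
print(estimated_range_km(23.3, 0.90, 0.80))  # ~210 km remaining-range estimate
```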



Figure 8: Regular and low-speed WLTP drive cycles  







Results



The thermal comfort levels achieved are comparable across all three scenarios. After 10 minutes, occupants feel comfortable in each case. The high compressor speed scenario reaches comfort level 0 in 4.5 minutes, the medium speed in 6 minutes, and the low speed in 6.5 minutes. After 10 minutes, the low-power scenario offers the same level of comfort as the high-power scenario while consuming 50% less energy. Further analysis indicates that overall comfort is primarily influenced by the comfort level of the body&#8217;s breathing sensation. This breathing sensation is less affected by design variations among the three speeds compared to other body parts, resulting in a smaller-than-expected overall change in comfort.



Figure 9: (a) Overall comfort for different compressor speeds







Medium and high compressor speeds consume 33.5% and 50.2% more energy, respectively, compared to low speed. Moreover, at low compressor speed, the battery drained more slowly, providing a range increase of up to 21% under some drive conditions. This test showed that reducing HVAC power can significantly enhance range without sacrificing passenger comfort.



Figure 10: (a) Energy savings (b) Range difference








High Speed: Achieves maximum comfort quickly but consumes more energy.



Medium Speed: Balances comfort and energy efficiency.



Low Speed: Consumes the least energy, with only a slight delay in achieving optimal comfort.




Conclusion: Paving the Way for Smarter EV Design



As electric vehicles become more prominent, tools like co-simulation are essential in crafting a user-friendly and competitive product. By blending 3D CFD with 1D system modeling, manufacturers can optimize a vehicle’s thermal performance and ensure it meets consumer expectations for both range and comfort. Together, these simulations allow engineers to fine-tune vehicle parameters, such as compressor speed, in response to different driving cycles. For instance, reducing the compressor speed in the HVAC system can lead to significant energy savings without greatly impacting comfort levels.



This holistic approach promises a range of benefits:




Improved Range: Strategic HVAC adjustments can preserve battery life and extend driving range, reducing range anxiety.



Enhanced Comfort: Precision simulations ensure that passengers enjoy a comfortable cabin experience, even in extreme climates.



Energy Efficiency: Optimized HVAC systems lead to lower power consumption, helping to meet energy targets.




As simulation technology advances, EV designers are increasingly able to predict real-world outcomes in a virtual environment. By adopting these methods, the automotive industry is poised to make electric vehicles not only a viable alternative to traditional cars but a preferred choice for eco-conscious drivers worldwide. The workflow presented here also exemplifies MODSIM, Dassault Systèmes’ approach to integrated modeling and simulation, and supports the company’s long-term strategy to bridge the gap between designers and simulation engineers, ultimately speeding up the product development process.











Interested in the latest in simulation? Looking for advice and best practices? Want to discuss simulation with fellow users and Dassault Systèmes experts?&nbsp;The&nbsp;SIMULIA Community&nbsp;is the place to find the latest resources for SIMULIA software and to collaborate with other users. The key that unlocks the door of innovative thinking and knowledge building, the SIMULIA Community provides you with the tools you need to expand your knowledge, whenever and wherever.
 ]]>
      </content:encoded>
      </item>
<item>
      <title>
      <![CDATA[ Unlocking Value in Mining ]]>
      </title>
      <link>https://blog--3ds--com.apsulis.fr/brands/geovia/unlocking-value-in-mining/</link>
      <guid>https://blog--3ds--com.apsulis.fr/guid/274447</guid>
      <pubDate>Mon, 02 Dec 2024 11:59:53 GMT</pubDate>
      <description>
      <![CDATA[ The bottom line is that a centralised system deployed either on premises or on cloud would make the execution of the company’s Disaster Recovery Plan way more straightforward compared with decentralised systems.
 ]]>
      </description>
      <content:encoded>
      <![CDATA[ 
Gustavo Pilger, Dassault Systèmes, GEOVIA on the critical role of centralized data management for efficiency, security, and innovation in mining operations.



The rapid growth of data, driven by technological advancements, presents both benefits and risks. Consolidating and centralizing data management is important to enhance efficiency, cybersecurity, and business intelligence. We discuss the critical challenges and opportunities in managing data within the mining industry and also explore the role of Dassault Systèmes&#8217; 3DEXPERIENCE platform in enabling mines to optimize processes, ensure data security, and improve disaster recovery capabilities.



Q) Please outline the key challenges that the mining industry faces in managing, protecting, and storing data.



Data is part of the IP portfolio of a company (together with a range of assets).  Therefore, it should be managed as any other valuable asset. Over the last 2 decades in particular, with technology advancements and the advent of a range of sensors, we have seen an &#8220;explosion of data&#8221; across industries including mining. 



This brings opportunities and challenges at the same time. The opportunities are mainly associated with the potential to better understand processes, enabling one to improve them with productivity and efficiency gains that often lead to cost savings.



To achieve this state, however, one needs to overcome a few challenges: from navigating through a plethora of data to extract knowledge, to cybersecurity risks that could expose corporations to significant financial losses. The ability to count on a range of data to unlock or optimize mining processes is great. 



However, one of the first challenges is to consolidate data that is often captured and stored in different systems. Not only are these data stored in decentralised, disparate (local) repositories, but the systems are administered by different people with different levels of responsibility and awareness when it comes to data integrity and related risks. 



So, it is important that data is properly stored and managed in a way that allows one to extract the most knowledge from it while preserving its integrity and limiting its exposure.



Q) How should mining companies approach consolidating and centralizing their data management to enhance data security?



The first step towards data consolidation is to compile a data inventory across the mine, including information about type, format, purpose, frequency of change, etc. This allows one to map out the data flow within and between processes across the mine, to then assess what matters most and where potential bottlenecks are, in order to prioritise where to begin. 



Therefore, understanding the data ecosystem together with the impact they have across KPIs is key to drive change in this space.



All sorts of data are being collected from a range of equipment (including sensors) across the environment of a mine. Together with good, valuable data also comes noisy data &#8211; and lots of them.  



Therefore, ideally, the data collected across the mine not only needs to be federated (or consolidated), but also needs to be indexed, sanitized (filtering out the noise), and contextualized so that meaningful insights can start to be extracted for decision-making.  



This could be achieved with the adoption of a centralized system that allows ingesting data collected by equipment across the mine, as well as their management in a safe and secure environment. The Dassault Systèmes 3DEXPERIENCE platform offers this solution.



Q) What critical benefits do mines gain from centralizing their data management?



I think the ultimate benefit is about being in control of the data instead of data taking control! One can only improve what is measured and understood! 



A centralized platform that allows data federation, indexation, 3D contextualization, analytics, and action management, all in a secure environment, puts you in control of your assets, allowing you to extract the most value out of them.




Also, with decentralised systems, a great amount of time is typically dedicated to finding the right data, or the latest version of the data, to work with. This translates into enormous inefficiencies, errors, re-work, and frustration, leading to employee disengagement and creating a vicious cycle of inefficiency. A centralised system with rigorous access control processes, on the other hand, eliminates these inefficiencies. 




Every employee has access to the right data, in terms of permissions and versioning, to conduct his or her work. Every decision taken by employees is recorded and justified within the system, providing an inherent layer of traceability and auditability. Other benefits include de-risking data integrity and exposure.



Q) Tell us about the role of centralized data management in improving data analytics and business intelligence, and how this benefits mines and their personnel.



GEOVIA, a Dassault Systèmes brand, provides software tools that allow our mining clients to model and simulate processes and how they interact with adjacent (connected) processes before anything is actually built, in early project development phases, or to correct the train of action on projects already in production in order to keep chasing value while operating.



Since the underlying data is federated, indexed, standardized, and contextualized in a safe and secure single repository, and systems are connected with inputs and outputs associated through common data models, one can test multiple hypotheses or scenarios in the virtual world (Virtual Twin Experience) to efficiently apply a given design or plan in the real world &#8211; eliminating unnecessary waste, reducing risk, and minimizing material re-handling while maximizing productivity! 



Data is not only safe and secure, but also indexed (for quick retrieval), standardized through semantic dictionaries, and contextualized, enabling meaningful links and associativity between processes and data.



It is this data associativity, combined with smart methods and algorithms, that allows one to constantly chase value while in operation, adjusting to uncertainty and unplanned events (whether technical, mechanical, or market-related). 



I’d like to emphasize that having this core data, industry knowledge and know-how supported by semantic dictionaries (ontologies) central to our business platform (3DEXPERIENCE) that is built on a multi-physics and multi-scale foundation allows us to go beyond Generative AI and Large Language Models (LLMs). 



With this core set of characteristics, what we offer instead is Industry Language Models (ILMs) that indeed leverage LLMs but are combined with ontologies and industry knowledge and know-how within a platform environment (3DEXPERIENCE) that inherently provides governance and traceability.



Q) Please explain the ways in which centralized data management enhances a mine’s disaster recovery capabilities and why this is critically important?



A decentralised data management system, with data fragmented and scattered across the corporation, would need to rely on the systematic discipline of the personnel in charge to regularly back up locally stored data, which can be a challenge in itself. It would therefore be very hard (if not impossible) to fully recover should a disaster occur.



Instead, a centralised system can be restored in a matter of hours in case of disaster, assuming, of course, that appropriate levels of redundancy, training, and protocols are in place to keep disruption to a minimum.



The bottom line is that a centralised system deployed either on premises or on cloud would make the execution of the company’s Disaster Recovery Plan way more straightforward compared with decentralised systems.



Q) Ultimately, how does centralizing data management improve both a mine’s cybersecurity and the safety of its employees?



Data centralisation makes it possible to significantly reduce risks associated with data integrity and cybersecurity. Consolidating the data in a single repository reduces the risk of losing or corrupting data that would otherwise reside on the local drives of desktop computers across mine sites, or on the laptops of employees required to work on the data. 



Instead, on a centralised system such as the 3DEXPERIENCE platform, the right version of the right data is available at any time to the right people. Since the 3DEXPERIENCE platform includes a rigorous access control process, data is made available to employees according to their roles and needs. 




For example, a Surveyor does not need access to sensitive data such as gold grades from core logging, while a Resource Geologist does, as it is required to conduct their work. All of this combined significantly mitigates the risks associated with data integrity, exposure, and cybersecurity.




For those who choose to embrace the cloud to store and manage data via a cloud provider, be assured that the risks are well managed; arguably better managed than in in-house data centres. 



This is because most cloud vendors, such as Dassault Systèmes, operate with heightened security practices tailored towards protecting their infrastructure, applications, and customer data. A good cloud provider will adhere to industry standards and best practices that include:




ISO 2700x standards, in particular the implementation guide ISO 27002



NIST 800 series



OWASP (Open Web Application Security Project) methodologies



COBIT framework




Also, good cloud providers employ multiple, independent and redundant mechanisms at various levels to block attacks. These measures provide far better security than most organisations can provide for themselves.



Therefore, in terms of risk management, it is a win-win proposition for all, including corporations, employees, contractors, and customers.







Community is a place for GEOVIA users – from beginners to experts and everyone in between – to get answers to your questions, learn from each other, and network.&nbsp;Join our community to know more:



GEOVIA User Community&nbsp;–&nbsp;Read about industry topics from GEOVIA experts, be the first to know about new product releases and product tips and tricks, and share information and questions with your peers. All&nbsp;industry professionals are welcome to learn, engage, discover and share knowledge to shape a sustainable future of mining. &nbsp;



New member?&nbsp;Create an account, it’s free!&nbsp;Learn more about this community&nbsp;HERE.
 ]]>
      </content:encoded>
      </item>
<item>
      <title>
      <![CDATA[ AI augments Engineers for Sustainable Innovation ]]>
      </title>
      <link>https://blog--3ds--com.apsulis.fr/brands/catia/ai-augments-engineers-for-sustainable-innovation/</link>
      <guid>https://blog--3ds--com.apsulis.fr/guid/274344</guid>
      <pubDate>Thu, 28 Nov 2024 14:59:56 GMT</pubDate>
      <description>
      <![CDATA[ For over 40 years, Dassault Systèmes, through its leading brand CATIA, has been at the forefront of industrial transformation and continues to be so today by using artificial intelligence (AI) to drive sustainable innovation.
 ]]>
      </description>
      <content:encoded>
      <![CDATA[ 
How do we navigate AI for Industrial Transformation?



From design to industrialization, 80% of today's vehicles and aircraft have been developed with their complete digital twin, and now their virtual twin, with Dassault Systèmes, especially with CATIA. These solutions help the company extend its reach to other critical sectors, such as smart cities, energy infrastructure, and even the human heart.



AI serves not just as a tool for automation but as an enhancer of human capability, allowing engineers, designers, and architects to explore more innovative and sustainable solutions within development cycles. The core of AI lies in data, transforming it into valuable knowledge and expertise. This resembles a modern renaissance, where accumulated industrial know-how is made accessible across the entire product creation and industrialization value chain.




We are not automating to replace engineers, but augmenting their abilities to explore more innovative and sustainable solutions.

Olivier SAPPIN, CATIA CEO








Virtual Twins: Bridging the Virtual and Real Worlds



Virtual twins are pivotal in integrating AI with industrial data, offering a comprehensive digital representation of physical assets. By creating virtual twins of products such as vehicle batteries, Dassault Systèmes with the 3DEXPERIENCE platform enables detailed simulation and testing in a virtual environment, reducing the need for costly and time-consuming physical prototypes.



Detailed Simulation with Virtual Twins




&nbsp;Simulation and Testing: Virtual twins allow the simulation of product characteristics such as battery autonomy, heat resistance, and steering angle, predicting performance under various conditions.



&nbsp;Design Optimization: Engineers and designers can refine product designs virtually, enhancing efficiency and sustainability at the early concept stage, before physical production begins.








This approach not only saves time and resources but also enhances competitive advantage by enabling rapid prototyping and testing.



AI-Driven Enhancements in Virtual Twin Solutions



AI is crucial for advancing virtual twin solutions. It provides predictive insights that accelerate design and decision-making processes. AI can foresee potential issues and optimize performance by capturing and analyzing data from both virtual models and real-world usage.



Predictive Capabilities




&nbsp;Rapid Prototyping: AI algorithms enable quick generation and testing of multiple design alternatives within CATIA.



&nbsp;Risk Management: Predictive analytics help in identifying and mitigating risks in complex projects, ensuring timely and cost-effective delivery.













&#8220;AI allows us to explore solutions rapidly, providing insights that would traditionally take months to discover.&#8221;








One Single Source of Truth



For more than 13 years, Dassault Systèmes has introduced the 3DEXPERIENCE platform to capitalize on the potential of virtual twins. This data-driven platform elevates information to knowledge and expertise, making it accessible and reusable across various domains.



Some key features:




&nbsp;Data Structuring: Ontologies are used to intelligently structure data, enhancing its usability.



&nbsp;Integration of Real and Virtual Data: By combining real-world usage data with virtual models, the platform delivers comprehensive insights for decision-making.












Protecting Industrial Data in the Age of AI



Data protection is paramount as industries leverage AI for innovation. Data anonymization ensures that proprietary industrial knowledge remains secure while benefiting from AI advancements.



Data Security Strategies




&nbsp;Anonymization Techniques: Used in sectors like healthcare, these techniques protect sensitive data while enabling advanced simulations.



&nbsp;Proprietary Data Utilization: AI is trained on industrial data for the exclusive benefit of its originators, ensuring competitive advantage.




Ready to embrace the Generative Economy



Dassault Systèmes prepares industries for a generative economy, where the focus shifts from products to experiences, with a new factor in the balance: circularity. By leveraging AI and virtual twin technology, engineers and designers can innovate sustainably, reducing development cycles and enhancing operational excellence.




AI gives engineers superpowers to accelerate product development, reduce life cycles, and optimize production.

Olivier SAPPIN, CATIA CEO








As industries face increasing demands for competitiveness and sustainability, adopting these advanced technologies becomes imperative. Watch our latest webinar to learn how you, as an industrial company, can harness generative AI for your business needs.



And explore how CATIA can transform your design and decision-making processes by integrating AI and virtual twins into your operations.
 ]]>
      </content:encoded>
      </item>
<item>
      <title>
      <![CDATA[ Build a Better Battery Cell with Simulation-driven Engineering ]]>
      </title>
      <link>https://blog--3ds--com.apsulis.fr/brands/simulia/build-better-battery-cell-simulation-driven-engineering/</link>
      <guid>https://blog--3ds--com.apsulis.fr/guid/274102</guid>
      <pubDate>Tue, 26 Nov 2024 16:28:56 GMT</pubDate>
      <description>
      <![CDATA[ Simulation helps engineers to enhance battery cell design and develop new cell technology. In this blog post, we will introduce the Battery Cell Engineering workflows from SIMULIA on the 3DEXPERIENCE® platform and demonstrate how they can be utilized to create high-performance battery systems.
 ]]>
      </description>
      <content:encoded>
      <![CDATA[ 
Introduction



Batteries are becoming increasingly important in our daily lives, from smartphones to electric cars to large-scale power grid storage. As electrification becomes more widespread, batteries with higher capacity, lower cost and weight, longer lifespans, and the ability to meet strict operating conditions and safety standards will be needed. Companies that can meet these requirements will have a significant competitive advantage.



The process of developing an improved battery begins at the cell level. Cells are the fundamental units of batteries, comprising electrodes and electrolytes. A battery pack is formed by connecting multiple cells, often with added structural, thermal and control elements.



Simulation helps engineers to enhance battery cell design and develop new cell technology. In this blog post, we will introduce the Battery Cell Engineering workflows from SIMULIA on the 3DEXPERIENCE® platform and demonstrate how they can be utilized to create high-performance battery systems.



Challenges of Battery Engineering











When developing a battery system, engineers must consider many competing design requirements. The following examples are from electric vehicles, but other industries have similar needs:




Capacity (driving range): The battery should have the maximum possible capacity to minimize the frequency of recharging and extend the overall lifespan of the device.



Charge time: The faster the battery charges, the sooner a driver can return to the road.



Weight: A lighter battery leads to quicker acceleration and improved energy efficiency.



Longevity: The car battery is a costly component. A longer lifespan reduces maintenance costs and increases resale value.



Temperature: Charging and discharging produce significant heat inside the battery, requiring it to be cooled in hot weather and warmed in cold weather.



Safety: The battery must withstand the stresses and vibrations of use and remain safe even in a crash.




To achieve all these design goals and find the best trade-offs, engineers need to understand not only how the cell behaves in the lab but also how it will perform under real operating conditions.



Recent advancements in the battery industry have also made development more challenging. As the industry expands, suppliers and manufacturers form a more intricate supply chain. Developing battery cells, manufacturing them at scale, and integrating them into vehicles or devices can involve numerous players, each operating at a different level of detail, from the molecular to the system. Established cell manufacturers face competition from startups, and joint ventures with other industries, such as automotive and energy, are increasingly common. Cell manufacturers are also exploring new technologies such as solid-state electrolytes and sodium-ion cells.



Why Simulate Battery Cells?











Test on a Virtual Twin Without a Prototype



Simulation enables engineers to meet these challenges. With simulation, engineers can analyze battery performance without a physical prototype using a virtual twin. This digital representation of the battery includes all the relevant data—such as geometry, electrode and electrolyte properties and their interactions—needed to represent its real-world behavior accurately.



Virtual twins need to capture the complex geometry of battery cells, such as layered cylindrical (“jellyroll”) designs. The 3DEXPERIENCE Battery Cell Engineering solutions help design the layered 3D battery cell geometry and convert them into detailed, realistic simulation-ready models. After simulation, every aspect of the cell, such as temperature distribution or ion concentration, can be visualized in 3D.



The virtual twin can be analyzed at any stage of development, from very early in the design phase to before constructing a physical prototype. This allows for comparing different concepts and optimizing design parameters to ensure that the design will meet the requirements before committing to a specific design. The risk of potential failure, costly rework, and project delays are minimal.



Optimize Electrochemistry for Efficient Performance











The performance of a battery in charging, storage and discharging is determined by its electrochemistry. This complex multiphysics, multiphase phenomenon is determined by the 3D structure of the cell and interplay between the electrode and the electrolyte. Analyzing these with testing is time-consuming and measurement limitations inherently constrain the insight provided.



The Battery Cell Engineering solution provides an extended, 3D porous electrode theory (PET) based on the Newman model to simulate the cell’s performance in real-world situations. This models the electrochemistry within the cell, considering both micro-scale and macro-scale details. The different aspects of the physics &#8211; structural, thermal, electrochemical, and pore pressure &#8211; are considered together. Engineers can analyze factors such as charge/discharge behavior at different charge rates under different mechanical and thermal conditions. Because the simulation is in 3D, users can also assess and predict three-dimensional behaviors such as thickness deformation and stress caused by swelling.



Ensure Safety in Real-world Scenarios











Battery cells are designed to store high energy densities in a portable way, such as inside a smartphone or an electric car. As a result, they are exposed to many difficult and dangerous scenarios—extremes of heat and cold, bending, impact and penetration. Battery cells need to withstand these hazards—if they fail, they should fail safely.



Simulation can safely replicate dangerous real-world scenarios within a virtual environment. Events such as nail penetration, car crash or thermal runaway can be studied without the cost and risk of constructing and destroying a physical prototype.



Make Batteries a Better Investment with Longer Lifespan and Reliability



Batteries age over time (calendric aging) and through repeated use (cyclic aging). Calendric aging occurs, for example, when a battery is stored out of use, while cyclic aging takes place each time the battery is charged or discharged. The cost of the battery significantly influences the cost of an electric vehicle, and battery aging is one of the primary causes of the rapid depreciation and increased cost of ownership of electric cars. Electric vehicles will become a more attractive investment for drivers and fleet managers if battery cells can last longer.



The Battery Cell Engineering solutions on the 3DEXPERIENCE platform provide comprehensive workflows to simulate these aging processes. They can model various battery aging mechanisms, such as formation and growth of the SEI, lithium plating, and dissolution of the cathode. By analyzing these effects, engineers can optimize the battery&#8217;s lifespan and produce the more reliable batteries that customers demand.
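

As a rough, generic illustration of these two aging modes (a toy empirical model of our own, not the 3DEXPERIENCE workflows): calendric fade is often taken to grow with the square root of storage time, as in diffusion-limited SEI growth, while cyclic fade grows with charge throughput.

```python
# Toy empirical aging model: calendric fade ~ sqrt(time), cyclic fade ~
# charge throughput. Coefficients are arbitrary placeholders chosen only
# to make the example run; they are not calibrated values.
import math

def capacity_fade(days_stored, equivalent_full_cycles,
                  k_cal=0.002, k_cyc=5e-5):
    """Fractional capacity loss from calendric plus cyclic aging."""
    calendric = k_cal * math.sqrt(days_stored)   # storage-driven (SEI growth)
    cyclic = k_cyc * equivalent_full_cycles      # use-driven wear
    return calendric + cyclic

# e.g. two years of ownership with 500 equivalent full cycles
print(1.0 - capacity_fade(730, 500))  # remaining capacity fraction, ~0.92
```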



Explore Battery Science Down to Cell Chemistry Optimization 



Battery Cell Engineering on the 3DEXPERIENCE platform combines SIMULIA multi-physics simulation workflows with key capabilities from BIOVIA for scientific chemical and material engineering and CATIA for design and modeling. Together, these support battery engineers in designing, analyzing, optimizing and validating battery cells using 3D virtual twins.



All process stages take place in the same environment: the 3DEXPERIENCE platform, which provides a single source of truth for all battery cell engineering data. Designers, analysts, and other stakeholders can share information and collaborate reliably and securely. Unified modeling and simulation (MODSIM) helps left-shift the analysis process so that cell designs can be optimized earlier and potential issues identified and resolved sooner, ensuring a more consistent, error-free, and accelerated design cycle.



The highly detailed 3D Newman modeling in the Battery Cell Engineering tools on the 3DEXPERIENCE platform is the key to creating realistic simulations of the cell’s thermal and electrochemical behavior. These simulations provide the highest quality predictions about the cell&#8217;s performance and age, including any impacts from its use in different conditions. Microstructural simulations enable deep analysis of material characteristics within the electrodes. Meanwhile, mechanical simulation is used to test the cell’s behavior in events such as thermal stresses, mechanical indentation or nail penetration so that engineers can design for optimal safety throughout the cell’s lifespan.



Conclusion



From smartwatches and phones to electric cars and grid storage, battery performance is crucial to the success of devices large and small. This performance is determined at the battery cell level by the electrochemistry and multi-physical interactions within. Developing an efficient, safe and competitive battery requires understanding the cell&#8217;s complex three-dimensional behavior.



Dassault Systèmes provides a full Battery Cell Engineering solution on the 3DEXPERIENCE platform. This solution integrates the best design and simulation solutions into a workflow. Using these tools, battery designers can analyze battery performance accurately from the comfort of their desks without having to build physical prototypes.



With the Battery Cell Engineering solutions on the 3DEXPERIENCE platform, battery cell manufacturers can enable collaboration between all stakeholders and left-shift analysis in the development cycle. Potential safety and efficiency problems can be resolved early without extensive re-designs that cause delays and cost overruns. With simulation, battery cell manufacturers can develop innovative and competitive new products while cutting R&amp;D costs and time-to-market.



For more information, see the on-demand webinars:



https://events.3ds.com/battery-cell-engineering-faster-modsim

https://events.3ds.com/future-aircraft-development-modsim











Interested in the latest in simulation? Looking for advice and best practices? Want to discuss simulation with fellow users and Dassault Systèmes experts?&nbsp;The&nbsp;SIMULIA Community&nbsp;is the place to find the latest resources for SIMULIA software and to collaborate with other users. The key that unlocks the door of innovative thinking and knowledge building, the SIMULIA Community provides you with the tools you need to expand your knowledge, whenever and wherever.
 ]]>
      </content:encoded>
      </item>
<item>
      <title>
      <![CDATA[ Unveiling the New Era in System Engineering with CATIA Magic ]]>
      </title>
      <link>https://blog--3ds--com.apsulis.fr/brands/catia/unveiling-the-new-era-in-system-engineering-with-catia-magic/</link>
      <guid>https://blog--3ds--com.apsulis.fr/guid/274070</guid>
      <pubDate>Tue, 26 Nov 2024 09:45:04 GMT</pubDate>
      <description>
      <![CDATA[ The integration of system engineering models takes a leap forward with the new CATIA Magic version, promising more dynamic diagrams and early-stage simulations to enhance product realization.
 ]]>
      </description>
      <content:encoded>
      <![CDATA[ 
In this video, David Augendre from STELLANTIS discusses the new version of CATIA Magic and its impact on system engineering. The updated software facilitates smoother integration of various models across different domains, allowing for dynamic diagrams and early-stage product verification. CATIA Magic helps streamline processes, enhance time-to-market, and optimize organizational and system interfaces by fostering collaboration and digital continuity.



The new version of CATIA Magic facilitates smoother integration of different models across various engineering domains, enhancing system-engineering expertise.



MBSE (Model-Based Systems Engineering) helps bridge the gap between mechatronics and software sectors, aiding in the formalization and design process.



Modeling and formalizing designs through CATIA systems and other tools allow for better project management, identifying progress, delays, and interconnections between different sectors.







Bridging Competence Domains with CATIA Magic



In the constantly evolving world of system engineering, the seamless integration of diverse models across various domains of expertise is crucial. The new version of CATIA Magic propels this integration by enabling a more serene and coherent relationship between different models within system engineering. This advanced tool facilitates the creation of dynamic diagrams, paving the way for simulations and verifications even before product realization.




The new version of CATIA Magic allows for a more relaxed connection of different models across domains of competence and expertise.




This innovation is particularly significant for industries like automotive, where reconciling mechatronic and physical domains with software approaches presents a substantial challenge. Model-Based Systems Engineering (MBSE) comes into play by concretely formalizing design intentions, effectively bridging these two worlds.



Enhancing Project Continuity with Integrated Platforms



The aspiration to design collaboratively has ushered in a shift towards common platforms that ensure digital design continuity. This journey spans from the initial expression of needs to the definitive software definition, streamlining the process across all stages.




 Key Performance Indicators (KPIs): These metrics are crucial in monitoring project deployment, helping to track progress and pinpoint delays in meeting project expectations.



 Interconnection Identification: Formal modeling enables the identification of interconnections among various professionals involved, clarifying interfaces and input-output relations between different domains.





Formalizing through modeling allows us to identify progress and delays concerning project expectations.




This structured approach enhances time-to-market efficiency and aids in defining interfaces, rationalizing processes, and identifying potential bottlenecks or future optimizations.



Driving Collaborative Design with Systemic Modeling



The value added by modeling in system engineering is profound, especially in optimizing systems and organizational structures. By leveraging tools like CATIA Magic, teams can enhance their collaborative design efforts, leading to more effective and efficient outcomes.




 Simulation and Verification: Early-stage simulations help verify the feasibility and effectiveness of designs, reducing risks and improving the design process.



 Interface Definition and Rationalization: Clearly defined interfaces facilitate smoother interactions between domains, minimizing miscommunication and errors.





The added value of modeling enhances time-to-market, interface definition, and rationalization, identifying blockages and future optimizations.








Conclusion



The latest version of CATIA Magic marks a significant milestone in the realm of system engineering. By fostering a more integrated and dynamic modeling environment, it bridges various domains of expertise, enhances project continuity, and drives collaborative design. As organizations strive for digital continuity and efficiency, tools like CATIA Magic stand as pivotal enablers of innovation and progress.



&#8220;Modeling brings added value in optimizing time-to-market, defining interfaces, and identifying potential bottlenecks and future optimizations in both systems and organizational structures.&#8221;







Discover how CATIA Magic solutions can help you, and join the free CATIA MBSE Cyber Systems community to discuss with MBSE experts, and watch tutorials and demonstration videos.
 ]]>
      </content:encoded>
      </item>
    </channel>
   </rss>