<?xml version="1.0" encoding="utf-8"?>
  <rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom">
    <channel>
      <title>Infrastructure, Energy &#038; Materials</title>
      <link>https://blog--3ds--com.apsulis.fr/industries/infrastructure-energy-materials/feed.xml</link>
      <description>Infrastructure, Energy &#038; Materials</description>
      <lastBuildDate>Thu, 05 Mar 2026 16:10:05 GMT</lastBuildDate>
      <docs>https://validator.w3.org/feed/docs/rss2.html</docs>
      <generator>3DExperience Works</generator>
      <atom:link href="https://blog--3ds--com.apsulis.fr/industries/infrastructure-energy-materials/feed.xml" rel="self" type="application/rss+xml"/>

      <item>
      <title>
      <![CDATA[ Benefits and Challenges of Using Big Data in Resource Estimation ]]>
      </title>
      <link>https://blog--3ds--com.apsulis.fr/brands/geovia/benefits-and-challenges-of-using-big-data-in-resource-estimation/</link>
      <guid>https://blog--3ds--com.apsulis.fr/guid/275094</guid>
      <pubDate>Thu, 12 Dec 2024 13:23:32 GMT</pubDate>
      <description>
      <![CDATA[ The good news is that advances in data acquisition technologies mean that a whole new world of soft data – from downhole geophysics, multi- and hyper-spectral core scanning, and more routine collection of geometallurgical parameters – is now available to inform and enhance resource modelling and estimation.
 ]]>
      </description>
      <content:encoded>
      <![CDATA[ 
Authored by Michael Mattera, GEOVIA Industry Process Consultant Senior.



To estimate the properties of a mineral deposit as reliably as possible — especially as economic orebodies around the world become increasingly complex — a geologist must thoroughly understand the deposit as well as the method of emplacement/mineralisation.



And the only way geologists can do that is by using sound, dependable data.



Yet traditionally, mining companies have relied solely on two exploratory drilling methods to obtain the physical samples that are typically the only working data resource geologists use to model and estimate mineralisation:




diamond drilling, which involves extracting small-diameter cores of rock for analysis



reverse-circulation drilling, which involves collecting crushed rock cuttings for analysis.




The result is that billion-dollar decisions are based on the physical analysis of a very small amount of material while the bulk of the material to be mined, both overburden/waste and the mineralised orebody itself, remains unexamined.



To ensure higher quality resource estimates, geologists need more data.



The good news



The base data for geological modelling and resource estimation can be classified as either hard data (data that is directly observed and measured), or soft data (which make up the bulk of what’s known as ‘big data’) from other sources.



The good news is that advances in data acquisition technologies mean that a whole new world of soft data – from downhole geophysics, multi- and hyper-spectral core scanning, and more routine collection of geometallurgical parameters – is now available to inform and enhance resource modelling and estimation.



For example, using soft data can help resource geologists detect correlations between variables that might not be immediately obvious from hard data alone, such as a subtle alteration pattern that is evident from hyper-spectral core scans but not in assay results. Including additional geometallurgical-related parameters, such as hardness or grindability, acid consumption, moisture content, or clay minerals, can also:




highlight potential processing issues or abnormal values that wouldn’t be recognised with a more limited dataset



help define trend surfaces, such as gradual changes in mean values that can be removed from the data to improve the quality of estimates



identify variables to be estimated that might not normally be included in the block models that represent the material to be mined.
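The trend-surface idea above can be sketched with a simple least-squares fit. The coordinates, coefficients, and grade values below are entirely hypothetical, purely to illustrate the detrending step:

```python
import numpy as np

# Hypothetical samples: a grade that drifts gradually across the deposit
# (the trend) plus local variation. A first-order (planar) surface is
# fitted by least squares and removed; the residuals are then estimated,
# with the trend added back afterwards.
rng = np.random.default_rng(0)
x = rng.uniform(0, 1000, 200)                    # easting (m)
y = rng.uniform(0, 1000, 200)                    # northing (m)
grade = 1.5 + 0.002 * x - 0.001 * y + rng.normal(0, 0.2, 200)

A = np.column_stack([np.ones_like(x), x, y])     # grade ~ a + b*x + c*y
coeffs, *_ = np.linalg.lstsq(A, grade, rcond=None)
residuals = grade - A @ coeffs                   # detrended values
```

With enough samples the fitted coefficients recover the underlying drift, and the residuals have a roughly constant mean, which is the property that improves the quality of the estimates.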




Using geometallurgical and other parameters – by applying self-organising maps, for example – also contributes to better domaining of the mineralisation. This is because it allows geologists to consider many more characteristics as they define which volumes of material share similar characteristics and which are distinct.
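To make the self-organising map idea concrete, here is a minimal, illustrative sketch on entirely synthetic samples (the attributes and populations are invented for the example, not drawn from any real workflow):

```python
import numpy as np

# Minimal self-organising map (SOM) sketch for domaining, with synthetic
# samples standing in for two mineralisation styles. Samples that map to
# the same SOM node are candidates for the same estimation domain.
rng = np.random.default_rng(1)
a = rng.normal([1.0, 10.0, 0.3], 0.1, (100, 3))  # grade, hardness, clay
b = rng.normal([3.0, 25.0, 0.8], 0.1, (100, 3))
samples = np.vstack([a, b])
samples = (samples - samples.mean(0)) / samples.std(0)  # standardise

nodes = rng.normal(0.0, 1.0, (4, 3))   # a 1x4 grid of node weight vectors
for epoch in range(50):
    lr = 0.5 * (1 - epoch / 50)        # decaying learning rate
    for i in rng.permutation(len(samples)):
        dists = np.linalg.norm(nodes - samples[i], axis=1)
        win = int(dists.argmin())
        for j in range(len(nodes)):    # pull winner and neighbours closer
            h = np.exp(-((j - win) ** 2) / 2.0)
            nodes[j] += lr * h * (samples[i] - nodes[j])

# Each sample's best-matching node is its candidate domain.
domains = np.linalg.norm(samples[:, None] - nodes[None], axis=2).argmin(1)
```

On this toy data the two populations end up concentrated on different nodes, which is the behaviour that supports domaining on many attributes at once.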



In addition:




Adding in big data — via techniques such as co-kriging using secondary variables — can help produce estimates that take more localised (at a selected mining unit scale) variations in the mineralisation into account, while still:



achieving acceptable slope of regression (a standardised measure of the quality of the estimates)



minimising conditional bias (the tendency for the true value to be less than the estimate where estimates are high, and greater than the estimate where estimates are low).
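A quick way to see what the slope of regression measures, using synthetic values rather than an actual kriging workflow (the numbers below are invented for illustration):

```python
import numpy as np

# Illustrative check of conditional bias: the slope of the linear
# regression of true block values on their estimates. A slope near 1
# indicates conditionally unbiased estimates; a slope well below 1 means
# high estimates tend to overstate (and low estimates understate) the
# true values.
rng = np.random.default_rng(2)
true = rng.normal(2.0, 0.5, 5000)            # "true" block grades
good = true + rng.normal(0.0, 0.05, 5000)    # precise estimates
noisy = true + rng.normal(0.0, 0.5, 5000)    # imprecise estimates

def slope_of_regression(estimate, true_values):
    """Slope of the regression of true values on estimated values."""
    return np.cov(true_values, estimate)[0, 1] / estimate.var(ddof=1)

print(slope_of_regression(good, true))   # close to 1.0
print(slope_of_regression(noisy, true))  # roughly 0.5
```

In a kriging context the slope is computed from the model covariances rather than from known true values, but the interpretation of "close to 1 is good" is the same.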



If a mining company chooses not to complete, for reasons of time or money, a full analysis of all the attributes of all physical samples, geologists can use big data to fill in (impute) missing values using estimation techniques, proxy formulas, or correlations. Once they have all desirable attributes available for each sample, they can then return to more conventional techniques, such as kriging, to produce estimates or simulations — a set of equally probable realisations of the estimates — for all required parameters.
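A minimal sketch of imputing a missing attribute from a correlation, with entirely hypothetical data and a deliberately simple linear proxy formula (real workflows would choose the technique to suit the data and flag imputed values):

```python
import numpy as np

# Hypothetical scenario: hardness was only measured on ~60% of samples,
# but correlates with an assay measured on all of them. Fit the
# relationship on the measured subset and use it to fill the gaps.
rng = np.random.default_rng(3)
assay = rng.uniform(0.5, 4.0, 300)
hardness = 8.0 + 3.0 * assay + rng.normal(0.0, 0.5, 300)
measured = rng.random(300) < 0.6          # samples with a hardness test

# Fit hardness ~ a + b * assay on the measured subset only.
b, a = np.polyfit(assay[measured], hardness[measured], 1)

imputed = hardness.copy()
imputed[~measured] = a + b * assay[~measured]
```

Once every sample carries a (measured or imputed) value for each attribute, conventional estimation techniques such as kriging can be applied to all required parameters.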



Incorporating big data as part of the resource modelling and estimation workflow increases resource geologists' ability to:



highlight areas of higher risk (with, for example, elevated levels of deleterious elements or material with potential processing problems) that could be subject to additional environmental or social considerations



adopt the industry best practice scorecard approach to the classification of the Mineral Resource estimates (from low to high confidence: Inferred, Indicated and Measured)



improve mine site safety by identifying zones that might have poor ground conditions or require a change to standard mining practices, a change that itself introduces non-standard or unexpected behaviour.




The bad news



At the same time, however, all this additional information can result in an overload of big data, potentially many terabytes in size, that might also have varying degrees of accuracy and which must be separately validated before it can be used. 



That validation can add substantial time and effort, since the new, non-traditional data must be made to work with — and be stored and visualised alongside — the traditional physical drilling information, such as lithologies and assays, typically found in a geologist’s resource database. Finally, big data also makes (potential) automatic modelling and simulation a complex, processing-intensive task.











So what should mining companies do?



With clear advantages to using big data, despite what it demands in time and effort, miners need to keep in mind that a poor dataset will always produce a poor estimation. A good dataset, taking into account all available data, will produce an estimation that is more statistically sound, with clearly defined reasoning behind each of the decisions made along the way.



To make the best use of all available data, mining companies should consider how they want to address four specific challenges:



1. Storage



The process of acquiring, validating, and analysing the base data for resource estimation is time consuming and expensive, which means miners must consider the value of the information and knowledge derived from that data when determining how they will store it.



They must also decide how long to store it for: it may take years or even decades before a company makes the decision to mine, while the mining operation itself can take place over decades, so the lifecycle of the data is also long. But even data that is decades old can remain valid and useful for analysis/modelling if it is appropriately stored and, most importantly, still available.



Currently, however, geologists often store the initial data they collect during the exploration phase on a laptop, which both limits access to this data by other project teams and increases the risk that the data — and its potential value — could be lost at any time if a geologist changes roles or the device is retired.



2. Multiple sources



Geologists need to be able to retrieve and use information easily, but the sheer range of data now available for geological modelling and resource estimation can make that difficult.



Today, base data comes in a wide variety of types, including lab results supplied directly from Laboratory Information Management System (LIMS) systems and descriptions of the diamond drilling core from which the physical samples were extracted, as well as data from, for example, hand-held/portable X-ray fluorescent (XRF) analysers, data historians that record drilling penetration rates, and metadata — ie those additional details, such as the time of day the data was collected and the person, company, or piece of equipment that collected the data, that are vital for confirming if the data is in the original form, if it has been manipulated or adjusted, or is a calculated average.



This leads to a dataset made up of a diverse collection of text files, Excel spreadsheets, and resource models in proprietary binary formats, alongside data stored in geoscientific information management software packages and core scans, which alone can run to terabytes, with much of it collected at different times and by different people and equipment.



3. Data lifecycle



Large amounts of data from multiple sources, acquired over many years, increase the challenges of both data domaining (dividing the rock mass into volumes with similar characteristics that are distinct from each other) and the Mineral Resource classification process.



Resource geologists must now consider the lifecycle of the data used in resource classification and find a way to accommodate and flag drilling results and other data with lower confidence (or which failed the validation test), while not losing portions of that dataset, such as lithological/structural interpretations, that could still be used for resource modelling purposes.



Also, as more data is collected, geologists may deem historical data with no or inappropriate quality assurance/quality control (QA/QC) less reliable for use in mineral resource estimation, and must have a way to incorporate this finding into the database to ensure only the highest quality data is used. For example, if newer, more accurate collar/downhole surveys or laboratory analysis methods identify weaknesses in previously collected data, that new data could make the use of historical data (such as lithological contact positions or assay information) inappropriate, depending on how the data is used in the resource definition and estimation process.



The same might happen with biased historical data: bias usually becomes apparent, and downgrades confidence in the data, only after a considerable period of time. It is crucial to maintain all metadata so that the data does not have to be revalidated from scratch before it is used in each resource update cycle.



4. Database management



In order to manage a resource estimation dataset that includes an array of big data properly, resource geologists need to be able to:




Discriminate between hard data, soft data, and any metadata that also needs to be included in the resource dataset, and store their reasons for considering the data suitable for estimation or not.



Maintain the integrity of the resource dataset to ensure that the level of confidence (low to high) in the data can be used to appropriately:



classify the confidence level of the resource estimates



determine the risk profile of the decisions based on those estimates.



Control access to the database to:



ensure that only validated and approved data (as opposed to raw data on which the QA/QC has not been verified) is used in the resource estimation process



identify where other data has been confirmed as suitable only for modelling the geology (such as the extent of the mineralised lithologies) as opposed to estimating the mineral content or other properties of the material to be mined, including waste/overburden.



Provide proof of a strong chain of custody for all data that will confirm, for example, that assay data has not been manipulated. This proof will increase confidence in the estimates during external independent reviews, and illustrate that the dataset is being well governed — a vital consideration for financing.




The future is in the cloud



While these are significant challenges, they are not insurmountable, and the future for resource modelling and estimation is, in my opinion, in the cloud.



A cloud-based platform:




removes storage limitations and allows for on-demand access to both data and processing power



ensures high availability, which can replace current back-up and disaster recovery processes, except for those that are time-sensitive and can affect mining/production



provides a central location to store and share all data, with sufficient on-demand computing resources available to accommodate repeatable workflows, rather than a collection of independent, difficult-to-back-up processes run on separate devices with limited processing power



offers the option of using standardised workflows to capture deep specialist knowledge, which then becomes permanently retained, role-based knowledge



enables processes that depend on access to powerful computing resources to be run more efficiently, both in time and cost, than using local machines with limited capabilities — for example, with a cloud-based computing platform, it becomes much more feasible to routinely undertake valuable studies when new data becomes available, such as simulating variations:



in the geology model and the resource estimates, and then building multiple mine plans based on these variations, or



in the beneficiation process when handling ore with differing chemical characteristics or ratios of ore types



makes it possible to:



quickly incorporate artificial intelligence and machine learning techniques in workflows to automate a number of time-consuming and/or repetitive tasks



construct workflows to produce financial models that incorporate much more of the underlying inherent variability of the mineralisation as opposed to those based on average assumptions — making true risk-based decisions using robust confidence intervals placed on key metrics, such as Net Present Value or Internal Rate of Return, possible.




In short, by being able to store, process, integrate, share, and display all available data types required for high-quality resource modelling and estimation, a cloud-based platform will contribute to overall improved orebody knowledge and understanding of the controls on mineralisation.



This in turn will result in significant downstream benefits, including better blending of material for processing, more consistent plant throughput, and ultimately, most importantly, higher product quality and increased profit.







Community is a place for GEOVIA users – from beginners to experts and everyone in between – to get answers to your questions, learn from each other, and network. Join our community to know more:



GEOVIA User Community – Read about industry topics from GEOVIA experts, be the first to know about new product releases and product tips and tricks, and share information and questions with your peers. All industry professionals are welcome to learn, engage, discover and share knowledge to shape a sustainable future of mining.



New member? Create an account, it’s free! Learn more about this community HERE.
 ]]>
      </content:encoded>
      </item>
<item>
      <title>
      <![CDATA[ Building Efficient and ESG-Compliant Mines with Virtual Twins ]]>
      </title>
      <link>https://blog--3ds--com.apsulis.fr/brands/geovia/building-efficient-and-esg-compliant-mines-with-virtual-twins/</link>
      <guid>https://blog--3ds--com.apsulis.fr/guid/275082</guid>
      <pubDate>Thu, 12 Dec 2024 13:13:45 GMT</pubDate>
      <description>
      <![CDATA[ In order to supply demand, the mining industry must discover more copper deposits, increase inventory, and accelerate production. The challenge lies in developing mines quickly while adhering to ESG norms, environmental regulations, and community stakeholder buy-in. Integrating technology across the mining value chain can reduce uncertainty and expedite this process. 
 ]]>
      </description>
      <content:encoded>
      <![CDATA[ 
By Gustavo Pilger, Director – Worldwide GEOVIA Research and Development Strategy and Management, Dassault Systèmes.



Fossil fuels currently meet about 75% of global energy needs. To transition to renewable energy sources by 2050, significantly larger quantities of minerals, especially copper, must be mined. The current rate of production will not suffice to meet demand. With only 800 million tons of known copper reserves, is it possible to supply this demand? And if it is, can it be done with minimal environmental impact while adhering to ESG norms and standing up to public scrutiny?



In order to supply demand, the mining industry must discover more copper deposits, increase inventory, and accelerate production. The challenge lies in developing mines quickly while adhering to ESG norms, environmental regulations, and community stakeholder buy-in. Integrating technology across the mining value chain can reduce uncertainty and expedite this process.



Avoiding silos in mining sub-processes with technology



Processes and sub-processes in mining typically occur in silos, leading to frustration among stakeholders due to inefficiencies that often cause disruption and rework. Engagement is key to fostering the collaboration that can break down silos and bring teams together toward common goals. It is important to communicate effectively in order to take stakeholders on a journey that results in a common understanding of contexts, objectives, constraints, risks, planning, and risk mitigation measures for achieving a given target. This is greatly facilitated if all stakeholders work collaboratively on a unified platform with consistent data models and a consistent user experience.



Mining could be seen as a series of interconnected systems, or a ‘system of systems,’ that includes several processes from securing permits and exploration down to development, production, beneficiation, sales and distribution, decommissioning, and site rehabilitation. How do you connect systems, technology, and people to ensure this system-of-systems “machine” works in a synchronized and harmonious manner while constantly chasing value and common KPIs? A first logical step is to clearly delineate the boundaries of these systems and identify how data flows, or how it should flow, within them.



This system-of-systems framework must recognize a constellation of systems and understand that the output of one system is the input of an adjacent one. It must also be aware of the decisions made within each system’s boundaries, as well as their implications and consequences for upstream and downstream processes. Virtual twin experiences provide this framework.




“Dassault Systèmes’ solution for sustainable, energy-efficient mining that meets environmental norms and revenue targets centers around virtual twin experiences.”








Virtual twins help embed sustainability in mining operations











The virtual twin provides a live, virtual replication of the real world, where processes and systems are interlinked and associated with one another. This includes the underlying data that informs and describes these processes, all interconnected from a multiphysics, multiscale, and multidisciplinary perspective.



With virtual twin experiences, associated data and intelligent methods help mining organizations pursue value throughout their operations, adapting to uncertainty and unplanned events of technical, mechanical, or market origins.



Can you change the way you operate by designing mines that extract more minerals more efficiently, while minimizing energy use and complying with ESG standards and environmental norms? Yes, the virtual twin can assist in this transformation. It can model and simulate many possible outcomes and environmental and social scenarios, balancing efficiency and costs against ESG norms while maximizing safety and value.




“By connecting various data and systems, the virtual twin creates a unified collaborative environment, allowing stakeholders to identify priorities and measure performance against benchmarks for responsible and sustainable growth.”




The virtual twin also enables management to monitor permit status, asset agreements, asset licenses, and associated cost analysis to ensure that everything proceeds according to plan.



The virtual twin ensures continuity between the natural environment, claim boundaries, and built infrastructure. Users can leverage immersive visualization with spatial contextualization to gain a comprehensive view of data for actionable insights. This ensures sustainability concerning energy and emissions, water, and biodiversity. Environmental, social, regulatory, and sustainability KPIs can be measured against benchmarks using the virtual twin.



The virtual twin also provides visibility into data with powerful integrated analytics and geospatial data. It allows the aggregation and propagation of data on land stewardship in line with company, compliance, and regulatory frameworks.




“Digital communities created through the virtual twin facilitate two-way data and model sharing, ensuring a unified user experience for all stakeholders.”








Why virtual twins are essential for energy efficiency in mining



Mining is one of the most power-intensive industries. Collectively, mining consumes about 11% of the world’s energy, primarily from fossil fuels. Therefore, powering mining and related infrastructure with renewable energy sources is essential for sustainability. Renewable energy also offers opportunities for cost savings, innovative mine designs, and resilience against uncertainties.




“With the virtual twin, mining energy systems can be designed as a single platform, allowing for the design and simulation of energy supply and sources.”




Many mine operators are likely to adopt a hybrid approach to energy. The virtual twin supports this hybrid approach to energy efficiency in mining by using advanced controls that enable miners to design, simulate, and visualize optimal energy configurations while reducing overall operating costs, thereby de-risking the electrification process.



In short, virtual twin experiences should play a major role in building the next generation of copper mines and in attracting an ever-growing digital-native workforce, helping to achieve sustainability goals and ESG mandates. Virtual twins are not only essential for de-risking mining projects from a multidisciplinary perspective but also for navigating business complexity while ensuring sustainability. Ultimately, virtual twin experiences are a fundamental communication tool for engaging the communities where mines operate, as well as governments, regulatory agencies, shareholders, and employees, from the outset, while fostering sustainable mining projects with shared interests.







Community is a place for GEOVIA users – from beginners to experts and everyone in between – to get answers to your questions, learn from each other, and network. Join our community to know more:



GEOVIA User Community – Read about industry topics from GEOVIA experts, be the first to know about new product releases and product tips and tricks, and share information and questions with your peers. All industry professionals are welcome to learn, engage, discover and share knowledge to shape a sustainable future of mining.



New member? Create an account, it’s free! Learn more about this community HERE.
 ]]>
      </content:encoded>
      </item>
<item>
      <title>
      <![CDATA[ Improving Geological Modelling in the Age of Data Overload ]]>
      </title>
      <link>https://blog--3ds--com.apsulis.fr/brands/geovia/improving-geological-modelling-in-the-age-of-data-overload/</link>
      <guid>https://blog--3ds--com.apsulis.fr/guid/275067</guid>
      <pubDate>Thu, 12 Dec 2024 12:06:05 GMT</pubDate>
      <description>
      <![CDATA[ However, while today’s big data and advanced technology can help geologists generate more accurate models than ever before, there are still challenges to overcome in using all that data and technology successfully. Fortunately, there are also solutions.
 ]]>
      </description>
      <content:encoded>
      <![CDATA[ 
Authored by Jacques Nel, GEOVIA Senior Industry Process Consultant.



Geologists face many challenges in dealing with the unprecedented amount of geoscience data available today. In this article, Jacques Nel discusses software developed by Dassault Systèmes with the potential to help mines to rise above the chaos and use all of their data to generate more accurate geological models that give them a critical strategic edge.



A geological model is a specific representation of the location, characteristics, and extent of the lithology and ore types of a mineral deposit. It is based on the knowledge of the geologist and on data gathered through such sources as geological field observations, drillhole records, geophysical surveys, and assays. It is a vital input for both the resource model and mine planning, and affects virtually every decision made throughout the mining process.



To put it another way, the success or failure of a mine largely rests on the shoulders of the geologist responsible for the original geological model.



However, while today’s big data and advanced technology can help geologists generate more accurate models than ever before, there are still challenges to overcome in using all that data and technology successfully. Fortunately, there are also solutions.



Challenge 1: Importing and managing the data required for an accurate geological model



Geoscience data is key to any technical or financial evaluation of a mineral deposit because it helps define the site, shape, and grade of the orebody.



Today, however, geologists are inundated with so much new geoscience data — structural, geochemical, lithological, remote sensing, etc. — that they often don’t know what to do with it all. At the same time, valuable historical data may be stored on a number of different servers, on staff members’ personal laptops, on outmoded disks or drives, or even in long-forgotten physical filing cabinets. There may also be so many different versions of some datasets that it’s impossible to know which is the latest.



All of this can lead to a lack of confidence in the data used for modelling and downstream decision-making processes, and/or to mines using valuable time and money to re-explore or re-acquire lost data. It also makes it difficult for mines to comply with Environmental, Sustainability, and Governance (ESG) regulations that call for them to become more transparent in how they acquire, store, manipulate, and use geoscientific data.



Challenge 2: Visualising and analysing the data required for an accurate geological model



Data visualisation gives geologists insight into their geoscience information and helps them identify geological trends or patterns as well as erroneous data. Data visualisation, however, like data management, has been made much more complicated by the vast quantities of data available today from both exploration and operational projects — most of it in different data types from different sources in a variety of file formats.



Those different sources and formats make it difficult for geologists to visualise and analyse their data in one application, meaning they could miss important trends or patterns, while the workflows currently available to them for integrating separate datasets may be too time-consuming or unreliable to use. The result is that valuable geoscientific data may be inadvertently excluded from the geoscientific analysis, reducing the accuracy of the geological model.



Challenge 3: Interpreting and modelling the geology of a mineral deposit



Throughout much of the world, regulatory bodies require geologists to adhere to strict standard operating procedures and regulations when generating geological models. This is not easy considering:




the number of tasks that must be completed to generate a model



the amount of data linked to each task



the timeframe allocated for each task, and



the complexity of keeping track of tasks, data, and timeframes.




But failing to adhere to these operating procedures and regulations is not an option, because it could lead to non-compliance, which would have a negative impact on the mine and/or mine company.



Having data stored in multiple repositories in many different file types compounds this challenge by forcing geologists to interpret data in several separate applications, which once again (see Challenge 2 above) may lead to them missing critical trends or patterns, and then producing incorrect interpretations and inaccurate geological models.



Challenge 4: Validating and sharing a geological model



Once a geologist has interpreted geoscience data and created a base geological model, there are a number of different methods they can use to ensure it has been generated as accurately as possible. These include:




doing a geostatistical simulation to construct geological scenarios that can test or quantify the uncertainty of the model



conducting a visual analysis of the domains compared to the actual data



using query filters to test the model



making a direct comparison of the new geological model to the previous model, and/or



sharing the model across the organisation for collaborative peer review.
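The direct-comparison step above can be sketched on a shared block grid. The grid size, domain codes, and change rate here are made up purely to illustrate the kind of check involved:

```python
import numpy as np

# Compare a new geological model against the previous one on a shared
# block grid: overall block-by-block agreement, plus the per-domain
# volume change as a quick flag for unintended shifts.
rng = np.random.default_rng(4)
previous = rng.integers(1, 4, size=(20, 20, 10))   # domain code per block
new = previous.copy()
changed = rng.random(previous.shape) < 0.05        # 5% reinterpreted
new[changed] = rng.integers(1, 4, size=int(changed.sum()))

agreement = (new == previous).mean()
print(f"{agreement:.1%} of blocks keep their previous domain")

for code in np.unique(previous):
    delta = int((new == code).sum()) - int((previous == code).sum())
    print(f"domain {code}: {delta:+d} blocks")
```

A large unexplained drop in agreement, or a big swing in one domain's block count, would prompt a closer look before the model is passed on for peer review.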




The challenge for many geologists right now, however, is that they may have only a single technique or application they can use to validate their model, which means they may miss certain issues that can render the model — and all the critical mine planning decisions made based on it — inaccurate. This in turn can lead to imprecise mining optimisations, possible safety issues, and less-than-expected production.



Sharing the model for peer review may also be difficult. Departments in the mining industry have traditionally acted in silos, which has made cross-departmental collaboration challenging. 



Many also still use email or flash drives to share important files, which makes it hard to be sure which version is the latest. Not all departments may have access to applications in which they can view or statistically analyse the model, and acquiring new licences can be costly.



Addressing the challenges



Overcoming all four of these challenges — the need for secure, effective data management; comprehensive geological visualisation and analysis; correct data interpretation; and accurate model validation — begins with ensuring that all geoscience data is integrated, stored, analysed, interpreted, and managed in a single, platform-based centralised repository.



This centralised repository will ensure that the latest, most complete version of any data becomes the single source of truth for everyone to reference. 



And it has the added bonus of helping to minimise the financial cost of obtaining geoscience data by increasing the percentage of it actively used in creating models.











One example of a platform-based centralised repository is Dassault Systèmes’ 3DEXPERIENCE platform. To assist with geological modelling, we have also developed a customised Geology Modelling repository, where all of a mine’s geoscience data is first securely stored on the platform, either on a mine company’s own premises or on a public or private cloud.



From this central hub, industry-proven applications, tools, and workflows make it far simpler for geologists to locate, interpret, display, and analyse their data, as well as to create, validate, and share geological models.



It works like this: The 3DEXPERIENCE platform connects to both GEOVIA Surpac geology and mine planning software and ENOVIA project and document management software. 



This enables geologists to drag and drop any data (drillhole, topographical point cloud, geophysical, assay, geotechnical, etc) held on the platform in any format into the GEOVIA Surpac graphics window to begin work immediately — no data conversion or lengthy import or export processes required. 



This kind of integration also has the benefit of automatic document versioning, data check-in/check-out, and user file and folder permissions to ensure traceability and accountability.



The software combination also gives geologists the ability to:




synchronise data for fast 3D visualisation



apply geological reasoning and logic to sculpt domain solids from all available data



create and compare various interpretations for “what if” scenario analysis to quantify the natural uncertainty



track the entire evolution of the geology model from data interpretation to model generation and validation



create a project plan, assign tasks, and monitor project progress (and any issues or bottlenecks) against the plan using a variety of visual methods, and



share the 3D geological model and any statistical reports and charts with stakeholders through customisable dashboards or communities.




The result is improved compliance with Standard Operating Procedures, ESG regulations, and the mine’s own KPIs. It also allows greater confidence in the quality of the geoscience data and the accuracy of the geological model, and better decision-making throughout the life of the mine.







Community is a place for GEOVIA users – from beginners to experts and everyone in between – to get answers to their questions, learn from each other, and network. Join our community to learn more:



GEOVIA User Community&nbsp;–&nbsp;Read about industry topics from GEOVIA experts, be the first to know about new product releases and product tips and tricks, and share information and questions with your peers. All&nbsp;industry professionals are welcome to learn, engage, discover and share knowledge to shape a sustainable future of mining. &nbsp;



New member?&nbsp;Create an account, it’s free!&nbsp;Learn more about this community&nbsp;HERE.
 ]]>
      </content:encoded>
      </item>
<item>
      <title>
      <![CDATA[ A Comprehensive Look Back at METEC India 2024: Advancing Green Steel ]]>
      </title>
      <link>https://blog--3ds--com.apsulis.fr/industries/infrastructure-energy-materials/a-comprehensive-look-back-at-metec-india-2024/</link>
      <guid>https://blog--3ds--com.apsulis.fr/guid/274667</guid>
      <pubDate>Fri, 06 Dec 2024 04:57:40 GMT</pubDate>
      <description>
      <![CDATA[ Explore the highlights of METEC India 2024, where we showcased innovations shaping the future of green steel and technology tailored for the steel industry.
 ]]>
      </description>
      <content:encoded>
      <![CDATA[ 
METEC India 2024 provided the perfect platform for Dassault Systèmes to showcase groundbreaking advancements in technology tailored for the steel industry. Focused on the evolution of &#8220;Green Steel,&#8221; the event underscored significant strides toward sustainability, efficiency, and digital transformation in steel production. From the use of AI to real-time data analytics and Virtual Twins, the demonstrated innovations provided a clear glimpse into the future of steelmaking.











A Landmark Event in Metallurgy and Steel Innovation



This year’s METEC India stood out for its focus on addressing critical industry challenges and opportunities. Key highlights from the event include:




Technological Innovation: METEC 2024 provided a platform to introduce groundbreaking technologies aimed at transforming the metallurgy and steel sector. From efficient production solutions to advanced materials, the range of innovations presented was impressive.



Global Participation: The event saw significantly increased engagement from both national and international stakeholders. Industry leaders, experts, and organizations came together to exchange insights and collaborate on shared goals.



Focus on Sustainability: A central theme of the event was sustainability, with numerous exhibitors and speakers emphasizing the need for environmentally conscious practices. Innovative solutions for sustainable production and environmental conservation were prominently showcased.



Insightful Discussions: Conference sessions and booth discussions provided deep insights into the future direction of the metallurgy and steel industry. Emerging trends, market dynamics, and the integration of advanced technologies were at the core of these conversations.




Attendee Feedback Highlights



Feedback from participants reaffirmed the event&#8217;s success and impact. Attendees expressed appreciation for the following:




Innovation and Relevance: The event’s emphasis on introducing new technologies was widely praised, with many attendees highlighting their relevance for driving industry advancement.



Engaging Networking: Participants commended the opportunities for creating valuable partnerships and exchanging ideas with like-minded professionals.



Focus on Sustainability: The showcased sustainable solutions and strategies struck a chord with the audience, reflecting the industry’s growing commitment to environmental responsibility.



Comprehensive Digital Transformation Strategy: We joined industry leaders to unveil a digital roadmap for the steel sector, integrating advanced supply chain strategies for increased productivity and a reduced environmental footprint.




The enthusiastic reception of our technology demonstrations highlighted a collective commitment to shaping a more innovative and greener steel industry.



A Shared Commitment to Green Steel



The steel industry is entering a new era that balances the demands of productivity with the imperative for sustainability. At Dassault Systèmes, we are proud to play a pivotal role in this transformation by empowering businesses to adopt technologies that pave the way for a brighter, greener future.



To all stakeholders—steelmakers, innovators, and environmental advocates—we invite you to collaborate with us in realizing the full potential of these advancements and creating a lasting impact. Together, we can reshape the future of steel into one that thrives on innovation, efficiency, and responsibility.



Thank you for joining Dassault Systèmes at METEC 2024. We look forward to continuing this exciting conversation.



Author: 











Sanjeev R &#8211; India Industry Business Consultant Senior, Dassault Systèmes India




 ]]>
      </content:encoded>
      </item>
<item>
      <title>
      <![CDATA[ Unlocking Value in Mining ]]>
      </title>
      <link>https://blog--3ds--com.apsulis.fr/brands/geovia/unlocking-value-in-mining/</link>
      <guid>https://blog--3ds--com.apsulis.fr/guid/274447</guid>
      <pubDate>Mon, 02 Dec 2024 11:59:53 GMT</pubDate>
      <description>
      <![CDATA[ The bottom line is that a centralised system deployed either on premises or on cloud would make the execution of the company’s Disaster Recovery Plan way more straightforward compared with decentralised systems.
 ]]>
      </description>
      <content:encoded>
      <![CDATA[ 
Gustavo Pilger, Dassault Systèmes, GEOVIA on the critical role of centralized data management for efficiency, security, and innovation in mining operations.



The rapid growth of data, driven by technological advancements, presents both benefits and risks. Consolidating and centralizing data management is important to enhance efficiency, cybersecurity, and business intelligence. We discuss the critical challenges and opportunities in managing data within the mining industry and also explore the role of Dassault Systèmes&#8217; 3DEXPERIENCE platform in enabling mines to optimize processes, ensure data security, and improve disaster recovery capabilities.



Q) Please outline the key challenges that the mining industry faces in managing, protecting, and storing data.



Data is part of the IP portfolio of a company (together with a range of other assets). Therefore, it should be managed like any other valuable asset. Over the last two decades in particular, with technology advancements and the advent of a range of sensors, we have seen an &#8220;explosion of data&#8221; across industries, including mining. 



This brings opportunities and challenges at the same time. The opportunities are mainly associated with the potential to better understand processes enabling one to improve them with productivity and efficiency gains that often lead to cost savings.



To achieve this state, however, one needs to overcome a few challenges: from navigating a plethora of data to extract knowledge, to cybersecurity risks that could expose corporations to significant financial losses. The ability to draw on a range of data to unlock or optimize mining processes is great. 



However, one of the first challenges is to consolidate the data that is often captured and stored in different systems. Not only are these data stored in decentralised (local), disparate repositories, but the systems are administered by different people with different levels of responsibility and awareness when it comes to data integrity and related risks. 



So, it is important that data is properly stored and managed in a way that allows one to extract the most knowledge from it while preserving its integrity and limiting its exposure.



Q) How should mining companies approach consolidating and centralizing their data management to enhance data security?



The first step towards data consolidation is to compile a data inventory across the mine including information about type, format, purpose, frequency of change, etc. This allows one to map out the data flow intra- and inter-processes across the mine to then assess what matters the most and where potential bottlenecks are in order to prioritise where to begin. 
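The inventory described above can be sketched in code. The following is a minimal, hypothetical illustration (the field names and example entries are assumptions, not part of any GEOVIA data model): each record captures type, format, purpose, and frequency of change, and grouping records by owning process is a first step towards mapping intra- and inter-process data flow.

```python
from dataclasses import dataclass

@dataclass
class InventoryEntry:
    name: str
    data_type: str         # e.g. "drillhole", "assay", "survey"
    file_format: str       # e.g. "csv", "dxf", "database table"
    purpose: str
    change_frequency: str  # e.g. "daily", "per campaign"
    owning_process: str

inventory = [
    InventoryEntry("collar table", "drillhole", "database table",
                   "resource modelling input", "per campaign", "geology"),
    InventoryEntry("pit survey", "survey", "dxf",
                   "reconciliation", "daily", "survey"),
]

# Group entries by owning process to start mapping the data flow across the mine.
by_process = {}
for entry in inventory:
    by_process.setdefault(entry.owning_process, []).append(entry.name)
```

Even a simple grouping like this makes it visible which processes own which data, and where hand-offs (and potential bottlenecks) sit.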



Therefore, understanding the data ecosystem, together with the impact it has across KPIs, is key to driving change in this space.



All sorts of data are being collected from a range of equipment (including sensors) across the environment of a mine. Together with good, valuable data also comes noisy data &#8211; and lots of it.  



Therefore, ideally, the data collected across the mine not only needs to be federated (or consolidated), but also needs to be indexed, sanitized (filtering out the noise), and contextualized so that meaningful insights can start to be extracted for decision-making.  
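The federate-index-sanitize-contextualize steps above can be sketched as a toy pipeline. This is a hedged illustration only, with invented sensor tags and ranges; it shows the idea, not how any particular platform implements it.

```python
from collections import defaultdict

def sanitize(readings, low, high):
    """Filter out noisy readings that fall outside a plausible physical range."""
    return [r for r in readings if low <= r["value"] <= high]

def index_by_tag(readings):
    """Index readings by sensor tag for quick retrieval."""
    index = defaultdict(list)
    for r in readings:
        index[r["tag"]].append(r)
    return dict(index)

def contextualize(index, sensor_context):
    """Attach site/equipment context so each reading is meaningful on its own."""
    return {
        tag: {"context": sensor_context.get(tag, {}), "readings": rows}
        for tag, rows in index.items()
    }

raw = [
    {"tag": "crusher-01/temp", "value": 74.2},
    {"tag": "crusher-01/temp", "value": -999.0},  # sensor fault -> noise
    {"tag": "pump-07/flow", "value": 12.5},
]
context = {"crusher-01/temp": {"site": "Pit A", "unit": "degC"}}

clean = sanitize(raw, low=-50.0, high=500.0)
hub = contextualize(index_by_tag(clean), context)
```

The faulty reading is filtered out, and each surviving reading carries the context needed to interpret it for decision-making.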



This could be achieved by adopting a centralized system that ingests the data collected by equipment across the mine and manages it in a safe and secure environment. The Dassault Systèmes 3DEXPERIENCE platform offers this solution.



Q) What critical benefits do mines gain from centralizing their data management?



I think the ultimate benefit is about being in control of the data instead of data taking control! One can only improve what is measured and understood! 



A centralized platform that allows data federation, indexation, 3D contextualization, analytics, and action management, all in a secure environment, puts you in control of your assets, allowing you to extract the most value out of them.




Also, with decentralised systems a great amount of time is typically dedicated to finding the right data, or the latest version of it, to work with. This translates into enormous inefficiencies, errors, re-work, and frustration, leading to employee disengagement and creating a vicious cycle of inefficiency. A centralised system with rigorous access control, on the other hand, eliminates these inefficiencies. 




Every employee has access to the right data, in terms of permissions and versioning, for conducting his or her work. Every decision taken by employees is recorded and justified within the system, providing an inherent layer of traceability and auditability. Other benefits include de-risking data integrity and exposure.



Q) Tell us about the role of centralized data management in improving data analytics and business intelligence, and how this benefits mines and their personnel.



GEOVIA, a Dassault Systèmes brand, provides software tools that allow our mining clients to model and simulate processes, and how they interact with adjacent (connected) processes, before anything is actually built in early project development phases, or to correct the course of action on projects already in production in order to keep chasing value while operating.



Since the underlying data is federated, indexed, standardized, and contextualized in a safe and secure single repository, and systems are connected with inputs and outputs associated through common data models, one can test multiple hypotheses or scenarios in the virtual world (Virtual Twin Experience) to efficiently apply a given design or plan in the real world &#8211; eliminating unnecessary waste, reducing risk, and minimizing material re-handling while maximizing productivity! 



Data is not only safe and secure, but it is indexed (for quick retrieval), standardized through semantic dictionaries, and contextualized, enabling meaningful links and associativity between processes and data.



It is this data associativity, combined with smart methods and algorithms, that allows one to constantly chase value while in operation, adjusting to uncertainty and unplanned events (whether technical, mechanical, or market-related in nature). 



I’d like to emphasize that having this core data, industry knowledge and know-how supported by semantic dictionaries (ontologies) central to our business platform (3DEXPERIENCE) that is built on a multi-physics and multi-scale foundation allows us to go beyond Generative AI and Large Language Models (LLMs). 



With this core set of characteristics, what we offer instead is Industry Language Models (ILMs) that indeed leverage LLMs but are combined with ontologies and industry knowledge and know-how within a platform environment (3DEXPERIENCE) that inherently provides governance and traceability.



Q) Please explain the ways in which centralized data management enhances a mine’s disaster recovery capabilities and why this is critically important?



A decentralised data management system, with data fragmented and scattered across the corporation, would need to rely on the systematic discipline of the personnel in charge to regularly back up locally stored data, which can be a challenge in itself. It would therefore be very hard (if not impossible) to fully recover should a disaster occur.



Instead, a centralised system can be restored in a matter of hours in case of disaster, assuming appropriate levels of redundancy, training, and protocols are in place to keep disruption to a minimum.



The bottom line is that a centralised system deployed either on premises or on cloud would make the execution of the company’s Disaster Recovery Plan way more straightforward compared with decentralised systems.



Q) Ultimately, how does centralizing data management improve both a mine’s cybersecurity and the safety of its employees?



Data centralisation significantly reduces risks associated with data integrity and cybersecurity. Consolidating data in a single repository reduces the risk of losing or corrupting data that would otherwise reside on the local drives of desktop computers across mine sites, or on the laptops of employees required to work with the data. 



Instead, on a centralised system such as the 3DEXPERIENCE platform, the right version of the right data is available at any time to the right people. Since the 3DEXPERIENCE platform includes a rigorous access control process, data is made available to employees according to their roles and needs. 




For example, a Surveyor does not need access to sensitive data such as gold grades from core logging, while a Resource Geologist does, as that data is required for their work. Combined, all of this significantly mitigates risks associated with data integrity, exposure, and cybersecurity.
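The role-based access idea can be illustrated with a minimal sketch. The role names and data categories below are hypothetical; real platforms configure permissions through their own administration tooling, not code like this.

```python
# Hypothetical role-to-data-category permissions. A Surveyor sees survey data
# but not assay grades; a Resource Geologist needs both.
PERMISSIONS = {
    "surveyor": {"survey_points", "topography"},
    "resource_geologist": {"survey_points", "topography",
                           "assay_grades", "core_logs"},
}

def can_access(role, data_category):
    """Grant access only if the role's permission set includes the category."""
    return data_category in PERMISSIONS.get(role, set())
```

Under this sketch, `can_access("surveyor", "assay_grades")` is denied while `can_access("resource_geologist", "assay_grades")` is allowed, mirroring the Surveyor/Resource Geologist example above.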




For those who choose to embrace the cloud to store and manage data via a cloud provider, be assured that the risks are well managed, and arguably better managed than in in-house data centres. 



This is because most cloud vendors, such as Dassault Systèmes, operate with heightened security practices tailored towards protecting their infrastructure, applications, and customer data. A good cloud provider will adhere to industry standards and best practices that include:




ISO 2700x standards, in particular the implementation guide ISO 27002



NIST 800 series



OWASP (Open Web Application Security Project) methodologies



COBIT framework




Also, good cloud providers employ multiple, independent and redundant mechanisms at various levels to block attacks. These measures provide far better security than most organisations can provide for themselves.



Therefore, in terms of risk management, it is a win-win proposition for all, including corporations, employees, contractors, and customers.







Community is a place for GEOVIA users – from beginners to experts and everyone in between – to get answers to their questions, learn from each other, and network. Join our community to learn more:



GEOVIA User Community&nbsp;–&nbsp;Read about industry topics from GEOVIA experts, be the first to know about new product releases and product tips and tricks, and share information and questions with your peers. All&nbsp;industry professionals are welcome to learn, engage, discover and share knowledge to shape a sustainable future of mining. &nbsp;



New member?&nbsp;Create an account, it’s free!&nbsp;Learn more about this community&nbsp;HERE.
 ]]>
      </content:encoded>
      </item>
<item>
      <title>
      <![CDATA[ Fueling the Future of Energy ]]>
      </title>
      <link>https://blog--3ds--com.apsulis.fr/brands/netvibes/fueling-the-future-of-energy/</link>
      <guid>https://blog--3ds--com.apsulis.fr/guid/274041</guid>
      <pubDate>Thu, 28 Nov 2024 08:47:03 GMT</pubDate>
      <description>
      <![CDATA[ Discover how a leading company in the energy sector is powering a carbon-neutral future through electricity with NETVIBES data science solutions on the 3DEXPERIENCE platform.
 ]]>
      </description>
      <content:encoded>
      <![CDATA[ 
CHALLENGE



A multinational leader in the energy sector found that data silos were making it difficult to manage costs, make accurate, timely decisions and avoid project delays. The company needed a 360-degree view of its assets, along with analytics and monitoring dashboards to help strengthen plant performance. In addition, it wanted a single source of truth that would help it optimize its engineering projects and boost overall competitiveness.



SOLUTION



The company chose NETVIBES data science solutions on the 3DEXPERIENCE platform to help it manage its capital projects with precision. Asset Information Intelligence solutions provide it with a single source of truth, powerful data analysis and real-time insights, along with enhanced collaboration capabilities.



BENEFITS



Users are able to make better decisions, backed by the insights that come from connected data analytics. Real-time visibility on program data, coupled with a dashboard view that contextualizes knowledge, documents and resources, has enabled the company to drive more efficient project management, accelerate project delivery and reduce construction and operating costs.







Paving the Way for Sustainable Power



A carbon-neutral energy future through electricity – that is the mission of one multinational leader in the energy sector. To help achieve it, the company is building and operating several energy plants.



In capital projects like these, even the smallest problem can cause long delays and cost overruns – all of which deter customers and governments, whose backing is crucial. However, it was difficult for the organization to avoid these issues because project data was spread across many different systems and directories. This caused a lack of visibility over millions of complex data items for equipment to be manufactured, installed and tested. It made it difficult for engineers to identify issues and next steps, or to see the impact their decisions would have in other areas like purchasing, subcontracted studies or on site. As a result, previous projects had encountered delays and extra costs.



Breaking down those data silos was essential to optimize the company’s control of its engineering processes. As well as bringing its data together, it wanted to be able to analyze and visualize the data in a way that would help project managers:




understand how to deliver each project on time, on budget and on specification,



draw on past experience to continuously improve planning and minimize errors,



identify potential risks ahead of time and take effective measures to reduce them.




The company had previously implemented NETVIBES data science solutions to provide change management dashboards and 360-degree views of assets. Next, it chose NETVIBES Asset Information Intelligence solutions on the 3DEXPERIENCE platform, to create a virtual twin of its program management that would incorporate key performance indicators (KPIs) and support precise dashboards to show project progress.



A Single Source of Truth



At the heart of the solution, NETVIBES integrates data from the company’s main IT systems and delivers a complete, dashboard view of assets and construction progress. It orchestrates all the information around an object, asset or process – including 3D and 1D drawings, metadata and documents – and makes it available in the context the user needs.



NETVIBES collaborated closely with the organization to develop a proof-of-value application that combined on-premises data in the 3DEXPERIENCE platform with data science capabilities in the cloud. This allowed the company to see how the solution would work, and how it could meet both its current and future project visibility needs.



In particular, the solution stood out from competing offers because it allows the company to connect natively to all its data and gain real-time insights from it. Collaboration would be easier, since the 3DEXPERIENCE platform was designed to help disparate teams work together.



Mastering Requirements Management



Making the data intuitive has made all the difference, and that is where the solution’s semantic graph index (SGI) cloud comes in. It makes it possible to manage and cross-reference data from different sources and present the information in a way that makes sense to individual users and their job roles. Having instant access to the right knowledge, processes and documentation has helped the company to enhance its capital project management in several ways.



One priority in the client roadmap is to be able to project KPIs onto the virtual twin, in effect building them into the project from the start. Users will be able to analyze the maturity of requirements and explore how they relate to objects in the system, in alignment with business rules. This would enable the organization to improve its management of requirements monitoring, carry out basic and detailed design reviews and validate the different construction phases of its energy power plant projects.



Historical analysis of requirements could also help users to respond more efficiently to any volatility in requirements. By factoring in the way requirements have evolved during past projects, the solution allows them to draw on the organization’s experience and anticipate future developments.



Intelligent Equipment Assignment



Assigning equipment is one area of project management that should be simple. In reality, it often involves project managers wading through details to identify the appropriate equipment and request it for their project – only to find that the piece they wanted is already in use elsewhere.



NETVIBES simplifies this process by bringing together all the data surrounding the equipment – its 3D representation in the 3DEXPERIENCE platform and the metadata that resides elsewhere. Each piece is cross-analyzed by family, metadata and zoning so that it’s easy to identify the most suitable components and apply them to the project.



With this information provided in a single, dashboard view, the project manager can filter out any equipment that is already assigned before they start their search. Then they can explore the available pieces to find the ones that are suitable for their current project. Once that’s done, they simply select the equipment they need and assign it by dragging and dropping it into the relevant task on the dashboard.
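The filter-then-assign flow described above can be sketched in a few lines. The equipment IDs, fields, and project codes here are invented for illustration; the actual dashboard performs these steps interactively rather than in code.

```python
# Hypothetical equipment records: already-assigned pieces carry a project id.
equipment = [
    {"id": "PMP-101", "family": "pump", "zone": "unit-2", "assigned_to": None},
    {"id": "PMP-102", "family": "pump", "zone": "unit-2", "assigned_to": "PRJ-7"},
    {"id": "VLV-201", "family": "valve", "zone": "unit-2", "assigned_to": None},
]

def available(items, family, zone):
    """Filter out already-assigned pieces, then narrow by family and zone."""
    return [e for e in items
            if e["assigned_to"] is None
            and e["family"] == family and e["zone"] == zone]

def assign(item, project_id):
    """Record the assignment, mirroring the drag-and-drop step."""
    item["assigned_to"] = project_id
    return item

candidates = available(equipment, family="pump", zone="unit-2")
assign(candidates[0], "PRJ-9")
```

Filtering by assignment status first, as the dashboard does, means a project manager never wastes time evaluating a pump that is already in use elsewhere.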



Optimizing Project Control



Having a single source of truth has empowered the company to improve its control of capital projects. For example, threads such as construction, engineering and sourcing all happen on different timelines throughout a project, but they are all critical when it comes to meeting deadlines and budget commitments. NETVIBES puts all these threads into context so that managers can anticipate and control project delays.



Predicted delays for ongoing projects are explained using similar past cases and recurrent behaviors that have been identified. By visualizing recurrent behaviors through time, the solution also indicates whether they are still relevant or have already been addressed – a crucial distinction for project managers who are deciding on their next steps.



As a result, the organization can create project plans much faster and optimize them in terms of objectives such as time, risk and resources, while aligning with established best practices and standards. Risks are minimized because the solution uses previous capital projects to identify them early on, so project managers can generate a timely mitigation plan.



Powering a Carbon-neutral Future



By using NETVIBES to break down silos and integrate data on the 3DEXPERIENCE platform, the company has created a single source of truth that helps it to keep capital projects on track. The real-time insights and enhanced collaboration that the solution provides are driving more efficient management and tighter control, enabling faster delivery of projects and reduced construction and operating costs. As the organization looks ahead, these capabilities are helping it to build the foundation for more sustainable power provision.



“It is critical for managers to have access to all information generated by an energy power plant project, including pumps and all data for a given asset,” said Morgan Zimmermann, CEO of Dassault Systèmes NETVIBES. “As a single source of truth for all data in the platform, NETVIBES empowers the company to make better project decisions. In addition, other stakeholders can see project status at a glance, without having to dive deep into the platform.”




It is critical for managers to have access to all information generated by an energy power plant project. As a single source of truth for all data in the platform, NETVIBES empowers the company to make better project decisions.
Morgan Zimmermann, CEO of Dassault Systèmes NETVIBES



Learn More Here



Download the eBook&nbsp;to discover more NETVIBES data science solutions in action!




 ]]>
      </content:encoded>
      </item>
<item>
      <title>
      <![CDATA[ Generating Production Scenarios ]]>
      </title>
      <link>https://blog--3ds--com.apsulis.fr/brands/geovia/generating-production-scenarios/</link>
      <guid>https://blog--3ds--com.apsulis.fr/guid/273287</guid>
      <pubDate>Tue, 19 Nov 2024 11:58:16 GMT</pubDate>
      <description>
      <![CDATA[ Relying on Dassault Systèmes’ Delmia production simulation tool, which can create 3D models of specific, discrete events, the company developed thousands of scenarios to visualise how equipment would move and interact under various mine designs.
 ]]>
      </description>
      <content:encoded>
      <![CDATA[ 
In this series of articles so far, we have illustrated how an international copper mining company went about finding the answer to a simple question: would it increase productivity and reduce operating costs if they changed the design of a new block-caving project to accommodate larger load-haul-dump (LHD) machines?



Now we are ready to tell you what the company discovered.



After using Dassault Systèmes’:




parametric design tool to test the effect of the larger LHDs on their original mine design in a number of key parameters, such as tunnel section and spacing, pillar size and undercut and extraction level elevation



PCBC mine planning software and other tools to automatically analyse selected parametric designs, calculate the economic reserve, and create a summary of average copper value, average economic value, and total tonnes extracted, and



Abaqus geomechanical simulation software to analyse the geotechnical aspects of the deposit,




the company determined that the extraction strategy they had been pursuing in their selected designs was too fast. It was clear that they would need to extract material much more slowly in order to allow enough time for the cave to mature and propagate.



With that vital decision made, and new designs with a longer time period generated, the mine was ready for the final step: production simulations.



Production challenges



Relying on Dassault Systèmes’ Delmia production simulation tool, which can create 3D models of specific, discrete events, the company developed thousands of scenarios to visualise how equipment would move and interact under various mine designs.



This video, for example, illustrates how a loader might move to a draw point, gather the material, and dump it into the ore pass.











This video of an open pit operation shows another type of equipment simulation (the basics are the same for block caving) available through DELMIA.











Simulations like these enable mine designers to evaluate a range of production-related issues, including equipment movement and, in the case of the copper miner we have been focusing on throughout this series of articles, the effect of bigger or smaller equipment:




Do the roads get more congested with larger LHDs?



Will larger LHDs allow the mine to produce what they expect to produce?




Results



After the copper mining company ran these final production simulations, they had all the answers they needed to determine whether they should stick with their original design for a new block-caving project, or change it to accommodate new, larger LHD machines.



The larger LHDs won the day.



Summary



Through parametric design, automation, and simulation — which took a matter of hours rather than weeks to accomplish — the company was able to create a final mine design that took into account:




optimal tunnel spacing, heights, and other parameters, including the offset between the tunnels and a geological structure at one end of the mine site, as well as draw-bell spacing and entry angles, ore pass locations, load elevations, etc.



the best possible economic reserve, average copper value, average economic value, and total tonnes extracted



geotechnical issues associated with the site in order to reduce the risks of poor caveability, large subsidence, air blasts, etc., and



the most beneficial extraction strategy for the site.




The result was a smaller mine than the copper company originally designed, with a significantly slower extraction strategy than they anticipated.



Instead of being disappointed with these results, however, the company was delighted.



Created with Dassault Systèmes software and expertise, the new, smaller mine design for the block-caving site is estimated to:




improve productivity by 20%



reduce operating costs by 10%, and



allow the company to save around US$700 million on the budget authorized for the original block-caving project with smaller LHDs.








Community is a place for GEOVIA users – from beginners to experts and everyone in between – to get answers to your questions, learn from each other, and network. Join our community to know more:



GEOVIA User Community – Read about industry topics from GEOVIA experts, be the first to know about new product releases and product tips and tricks, and share information and questions with your peers. All industry professionals are welcome to learn, engage, discover and share knowledge to shape a sustainable future of mining.



New member? Create an account, it’s free! Learn more about this community HERE.
 ]]>
      </content:encoded>
      </item>
<item>
      <title>
      <![CDATA[ Using Simulation to Complete Advanced Geotechnical Analyses ]]>
      </title>
      <link>https://blog--3ds--com.apsulis.fr/brands/geovia/using-simulation-to-complete-advanced-geotechnical-analyses/</link>
      <guid>https://blog--3ds--com.apsulis.fr/guid/273267</guid>
      <pubDate>Tue, 19 Nov 2024 11:50:06 GMT</pubDate>
      <description>
      <![CDATA[ Block caving requires a very large deposit, with sufficient height and footprint area, to be cost effective. It should also ideally include certain geotechnical characteristics, such as pre-existing rock fractures to speed fragmentation and enough rock mass strength to support extraction tunnels.
 ]]>
      </description>
      <content:encoded>
      <![CDATA[ 
Authored by Daniel Villa, GEOVIA Industry Process Consultant Expert.



In Article 1 of this series, an international copper mining company wanted to find out if it would be possible to increase productivity and reduce operating costs at a new block-caving project by changing their original design to accommodate new, larger load-haul-dump (LHD) machines.



Using a parametric design tool from Dassault Systèmes, the company tested the effect of the larger LHDs on their original mine design across a number of key parameters, such as tunnel spacing and undercut and extraction level elevation. From there, as described in Article 2, they used Dassault Systèmes’ PCBC mine planning software and other tools to automatically analyse selected parametric designs and, for each design, to:




generate a new draw point distribution



create draw columns, based on block model and grade distribution data, to suit each tunnel spacing



run best height of draw (BHOD) simulations to estimate the economic mineable reserve, and



calculate the economic reserve and create a summary of average copper value, average economic value, and total tonnes extracted — with all physical and economic parameters mapped to, and captured in, the DOE.




After the company’s mine planner selected the favourite scenarios, based on economic results, it was time to use simulation to complete advanced geotechnical analyses.



Block Caving’s Geotechnical Challenges



Block caving requires a very large deposit, with sufficient height and footprint area, to be cost effective. It should also ideally include certain geotechnical characteristics, such as pre-existing rock fractures to speed fragmentation and enough rock mass strength to support extraction tunnels.



But those characteristics can be hard to assess, and there is always the risk that the deposit will simply be too solid to cave or that it will collapse unpredictably, making it difficult to extract the ore efficiently and also potentially hazardous for workers and equipment. In addition, mines do not want to design a block caving project that will lead to air gaps during cave propagation, which can cause dangerous air blasts.











Geomechanical Simulation



Geomechanical simulation can help mines understand factors such as:




whether there is enough ground support for a block-caving project



if a particular design will result in subsidence and a large crater on the surface, and



how large the economic recovery might be.




The copper company was particularly concerned that, while it looked like they had a good formation of high-grade material within their proposed mine design, there might be problems with vertical cave propagation. Specifically, they feared that the cave might not propagate fast enough to avoid air gaps, which would have a major effect on extraction level location and extraction strategy.



In order to better understand and calibrate cave-back stress and the potential for breakthrough, the company’s mine designer took the draw points identified in Dassault Systèmes’ PCBC mine planning software and sent them to our geotechnical simulation software, Abaqus.






From there, Abaqus ran a series of scenario simulations using a variety of inputs — including swell factor, friction angle, cohesion, strain value, and principal stresses — to reveal the answer to what could be a billion-dollar question: where the limit between broken and solid material should be located.
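The structure of such a study, enumerating combinations of geomechanical inputs and collecting a response for each run, can be sketched as below. The `evaluate` function is a stand-in only: a real study dispatches each combination to a finite-element solver such as Abaqus, and all parameter values here are hypothetical:

```python
import itertools

# Hypothetical scenario sweep: enumerate input combinations the way a
# design-of-experiments driver would feed them to a solver.
friction_angle_deg = [30, 35, 40]
cohesion_mpa = [0.5, 1.0, 1.5]
swell_factor = [1.3, 1.4]

def evaluate(phi, c, swell):
    # Placeholder: a real run would return the simulated cave-back
    # position, i.e. the limit between broken and solid material.
    return {"phi": phi, "cohesion": c, "swell": swell}

scenarios = [evaluate(*combo) for combo in
             itertools.product(friction_angle_deg, cohesion_mpa, swell_factor)]
print(len(scenarios))  # 3 * 3 * 2 = 18 runs
```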



















Results



For the copper company, the geotechnical simulations revealed that the extraction strategy they had been pursuing was wrong. It was too aggressive and would generate a large air gap very quickly.



As a result, the designer changed the strategy to pull material more slowly and allow enough time for the cave to mature and propagate — a change that would have a major impact on NPV.
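Why a slower draw can still win on NPV comes down to discounted cash flows: pulling material aggressively brings revenue forward, but if the cave stalls behind an air gap, later production is lost. The toy comparison below illustrates the mechanism only; the cash flows and discount rate are invented numbers, not figures from the project:

```python
# Toy NPV comparison of two extraction schedules (all figures hypothetical).
def npv(cashflows, rate=0.08):
    # Discount annual cash flows ($M) back to present value.
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows, start=1))

aggressive = [300, 300, 150, 0, 0]      # fast draw; production lost after an air gap
measured   = [180, 180, 180, 180, 180]  # slower draw; cave matures, output sustained

print(f"aggressive NPV: {npv(aggressive):.0f} $M")
print(f"measured   NPV: {npv(measured):.0f} $M")
```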



This graphic shows the difference in NPV between two possible extraction strategies:



Source: S. Arndt, D. Villa, F. Khodayari and B. Ndlovu, “Investigating Economic and Risk Metrics Using Design of Experiments in Fully Coupled Caving Geomechanics Simulation”, Caving 2022.



What comes next



The final article in this series looks at how the real-life copper company evaluated thousands of production scenarios before arriving at a design that would create a smaller, but significantly more productive underground mine than its original mine design.







 ]]>
      </content:encoded>
      </item>
<item>
      <title>
      <![CDATA[ Integrating Parametric Design with Mine Planning ]]>
      </title>
      <link>https://blog--3ds--com.apsulis.fr/brands/geovia/integrating-parametric-design-with-mine-planning/</link>
      <guid>https://blog--3ds--com.apsulis.fr/guid/273168</guid>
      <pubDate>Mon, 18 Nov 2024 10:42:59 GMT</pubDate>
      <description>
      <![CDATA[ To make integration simple, Process Composer eliminates the need for a designer to manually transfer the parameters and results from the parametric design into a mine planning package. Instead, it works with PCBC to handle the output variables of the design as input variables for the mine plan, allowing for an automated workflow throughout the entire mine planning process and between multiple applications/software.
 ]]>
      </description>
      <content:encoded>
      <![CDATA[ 
Authored by Daniel Villa, GEOVIA Industry Process Consultant Expert.



In Article 1 of this series, an international copper mining company wanted to find out if it would be possible to increase productivity and reduce operating costs at a new block-caving project by changing their original design to accommodate new, larger load-haul-dump (LHD) machines.



Using a parametric design tool from Dassault Systèmes, they tested the effect of the larger loaders on their original mine design across four defined parameters:




tunnel spacing (production crosscut, draw bells, etc.)



undercut and extraction level elevation



offset from geology contact, east/west access tunnels, and



connection between levels through ore passes, ventilation raises, etc.




Owing to the efficiency of parametric design, these explorations took a matter of hours rather than days or weeks to accomplish, and the company was quickly ready to move to the next stage: figuring out whether these potential new designs would actually work the way they wanted them to.



Process Composer and PCBC mine planning tools



Using Dassault Systèmes’ PCBC mine planning tools combined with Process Composer, the copper company was able to easily:




integrate their parametric designs with mine planning, and



use automation and simulation of multiple scenarios to confirm whether or not changing extraction level design would in fact result in increased productivity.




To make integration simple, Process Composer eliminates the need for a designer to manually transfer the parameters and results from the parametric design into a mine planning package. Instead, it works with PCBC to handle the output variables of the design as input variables for the mine plan, allowing for an automated workflow throughout the entire mine planning process and between multiple applications/software.
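The essence of that hand-off is that one tool's outputs become the next tool's inputs without manual re-entry. The sketch below illustrates the pattern in miniature; the function names, keys, and formulas are all hypothetical, standing in for the real PCBC and Process Composer interfaces:

```python
# Sketch of an automated design-to-plan hand-off: the parametric design
# step produces a dict of outputs, which the planning step consumes
# directly as inputs. All names and numbers are illustrative.
def parametric_design(tunnel_spacing_m):
    # Pretend design step: wider spacing means fewer draw points.
    return {"tunnel_spacing_m": tunnel_spacing_m,
            "draw_points": int(600 / tunnel_spacing_m)}

def mine_plan(design):
    # Pretend planning step: keyed off the design outputs, not re-typed values.
    return {"scenario": f"spacing_{design['tunnel_spacing_m']}m",
            "columns": design["draw_points"]}

plans = [mine_plan(parametric_design(s)) for s in (30, 33, 36)]
print([p["columns"] for p in plans])  # -> [20, 18, 16]
```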











In the copper company’s case, PCBC automatically analysed the various parametric designs (signified by the 3DX Parameters box above) and then handed off to other tools to:




generate a new draw point distribution for each design



create draw columns, based on block model and grade distribution data, to suit each tunnel spacing



run best height of draw (BHOD) simulations to estimate the economic mineable reserve, and



calculate the tonnage, dilution, and grade of copper that could be extracted with the tunnels spaced at different intervals.
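The idea behind a best-height-of-draw calculation can be shown for a single draw column: walk up the column of blocks, accumulate value, and keep the height at which cumulative value peaks. This is a simplified sketch of the concept, not PCBC's actual algorithm, and the block values are invented:

```python
# Sketch of best height of draw for one column: find the draw height
# that maximises cumulative block value (revenue minus cost per block).
def best_height_of_draw(block_values, block_height=10):
    best_h, best_v, running = 0, 0.0, 0.0
    for i, value in enumerate(block_values, start=1):
        running += value
        if running > best_v:
            best_v, best_h = running, i * block_height
    return best_h, best_v

# One column, bottom to top: rich ore low down, waste higher up.
column = [8.0, 6.5, 5.0, 2.0, -1.5, -3.0, 0.5]
print(best_height_of_draw(column))  # -> (40, 21.5)
```

Drawing past the best height dilutes the column with low-value material, which is why, as shown above, wider tunnel spacing and taller pillars change the tonnes and grades that can be extracted.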




Draw columns



In this image, the draw columns generated by PCBC match extraction level tunnels set at 30m:











Here, the draw columns generated by PCBC match extraction level tunnels set at 36m:











As you can see, however, the draw columns for tunnels set at 36m created much higher pillars between each draw point than tunnels set at 30m, which will affect the tonnes and grades the company will be able to extract.



Results



The copper mining company’s designer continued to use Dassault Systèmes’ PCBC mine planning tools to:




vary other physical parameters, such as draw-bell spacing, entry angles, elevation level, and size of pillars



run multiple scenarios to compare results, and



for every design generated, calculate the economic reserve and create a summary of average copper value, average economic value, and total tonnes extracted — with all physical and economic parameters mapped to, and captured in, the Design of Experiment (DOE) process.




From there, the designer employed the Process Composer results analytics program to first review all the scenarios:











And then provide 2D or 3D visualisations of each design, where each dot represents one run of the simulation, making the results very simple to analyse and compare:











After that, the designer added the favourites, based on average net smelter return (NSR), to a basket for more in-depth review.



What comes next



The next article in this series will look at how the copper miner employed simulation to complete advanced geotechnical analyses. The final piece will discuss how the company evaluated thousands of production scenarios before arriving at a design that would create a smaller but significantly more productive underground mine than its original mine design.







 ]]>
      </content:encoded>
      </item>
<item>
      <title>
      <![CDATA[ How to Improve Block Caving Design and Planning: The Secret is Automation and Simulation ]]>
      </title>
      <link>https://blog--3ds--com.apsulis.fr/brands/geovia/how-to-improve-block-caving-design-and-planning-the-secret-is-automation-and-simulation/</link>
      <guid>https://blog--3ds--com.apsulis.fr/guid/273136</guid>
      <pubDate>Mon, 18 Nov 2024 10:29:14 GMT</pubDate>
      <description>
      <![CDATA[ Using Dassault Systèmes’ parametric design tool, which enables mines to virtually create, update, and analyse a design within their own operating environment, the copper company’s designer used a number of different inputs to accommodate larger LHDs.
 ]]>
      </description>
      <content:encoded>
      <![CDATA[ 
Authored by Daniel Villa, GEOVIA Industry Process Consultant Expert.



1. The value of parametric mine design in block caving



An international copper mining company had one successful block caving project up and running and had developed a similar design for a second. Before committing to that design, however, the company wanted to find out if it would be possible to increase productivity and reduce operating costs at the new mine by changing the original design to accommodate new, larger load-haul-dump (LHD) machines.



However, the company knew that this modification could have a number of consequences. For example, it could result in changes to:




tunnel sections, making pillars smaller and reducing stability, and



the distribution of tunnels at the extraction level (shown in orange in the image below), which would mean the entire design would have to be adjusted to maintain the connection between levels.












To help the company determine exactly what effect bigger LHDs would have on the existing design, their design team used a variety of integrated software options from Dassault Systèmes — starting with our parametric design tool — to analyse:




tunnel spacing (production crosscut, draw bells, etc.)



undercut and extraction level elevation



offset from geology contact, east/west access tunnels, and



the connection between levels through ore passes, ventilation raises, etc.




Why parametric design



While traditional 2D CAD-based design certainly works, it has its downsides, including the fact that, as a manual process, it takes a great deal of time because the designer must modify the entire shape of a design in response to a single change.



By using associativity to preserve the link between reference data — such as terrains and geology or resource models — and existing infrastructure models, parametric design removes or significantly reduces the need for a designer to edit the whole design in order to modify a single design parameter. The designer is able to update designs automatically (without losing previous designs) any time there is new input data because, while the inputs may have changed, the parameters of the design have not.
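The core idea, that geometry is derived from parameters rather than drawn, can be illustrated in a few lines. This is a minimal sketch of associativity, with invented names and dimensions, not a representation of the actual tool:

```python
# Minimal illustration of parametric associativity: tunnel centrelines are
# derived from a spacing parameter, so changing the parameter regenerates
# the layout instead of requiring manual redrawing. Numbers are illustrative.
class ExtractionLevel:
    def __init__(self, footprint_width_m, spacing_m):
        self.footprint_width_m = footprint_width_m
        self.spacing_m = spacing_m

    @property
    def tunnel_positions(self):
        # Derived geometry: recomputed every time it is read, so it always
        # reflects the current parameter values.
        n = int(self.footprint_width_m // self.spacing_m) + 1
        return [i * self.spacing_m for i in range(n)]

level = ExtractionLevel(footprint_width_m=180, spacing_m=30)
print(len(level.tunnel_positions))  # 7 tunnels at 30 m spacing

level.spacing_m = 36                # one parameter change...
print(len(level.tunnel_positions))  # ...and the layout updates to 6 tunnels
```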



In addition, parametric design allows designers to:




create and compare multiple 3D models, return to the original model and try again, or continue forward using a whole new set of parameters



run automated simulation loops to evaluate the impact of a change, and



test different hypotheses through scenario analysis.




Updating the original design



Using Dassault Systèmes’ parametric design tool, which enables mines to virtually create, update, and analyse a design within their own operating environment, the copper company’s designer used a number of different inputs to accommodate larger LHDs.



For example, the designer started by spacing the extraction level tunnels at 30m. The parametric design tool then automatically updated the undercut level to meet that criterion:











The red lines are the tunnels at the undercut level; the yellow lines are the tunnels at the extraction level, where the loaders operate.



For comparison, the designer then widened the extraction level tunnel spacing to 36m, and the tool immediately updated both the extraction and undercut level tunnels, while maintaining the same distribution of tunnels — in a matter of seconds:











After altering the tunnel spacing a few more times, the designer determined the 30m tunnel spacing was optimal and moved on to experimenting with tunnel heights and other parameters, including the offset between the tunnels and a geological structure at one end of the mine site.



The image below shows the tunnels positioned at 25m from the geological structure, shown in gray at the top:











After taking into account the most current geotechnical data, however, which indicated that this offset might not be enough for safety, the designer doubled the distance to 50m, and the parametric design tool again updated the whole design automatically to this:











The designer then went on to explore other parameters, including draw-bell spacing and entry angles, ore pass locations, load elevations, etc.



What comes next



The next article in this series will show how this same mine company used the Dassault Systèmes PCBC mine planning tool kit to integrate mine design with mine planning and to use automation and simulation to test those designs under real-life conditions.



The final two articles will look at how the company employed simulation to complete advanced geotechnical analyses and to evaluate thousands of production scenarios before arriving at a design for its new block caving project that would create a smaller but significantly more productive underground mine than its original mine design.







 ]]>
      </content:encoded>
      </item>
    </channel>
   </rss>