<?xml version="1.0" encoding="utf-8"?>
  <rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom">
    <channel>
      <title>GEOVIA</title>
      <link>https://blog--3ds--com.apsulis.fr/brands/geovia/feed.xml</link>
      <description>GEOVIA</description>
      <lastBuildDate>Thu, 05 Mar 2026 16:10:04 GMT</lastBuildDate>
      <docs>https://validator.w3.org/feed/docs/rss2.html</docs>
      <generator>3DExperience Works</generator>
      <atom:link href="https://blog--3ds--com.apsulis.fr/brands/geovia/feed.xml" rel="self" type="application/rss+xml"/>

      <item>
      <title>
      <![CDATA[ Benefits and Challenges of Using Big Data in Resource Estimation ]]>
      </title>
      <link>https://blog--3ds--com.apsulis.fr/brands/geovia/benefits-and-challenges-of-using-big-data-in-resource-estimation/</link>
      <guid>https://blog--3ds--com.apsulis.fr/guid/275094</guid>
      <pubDate>Thu, 12 Dec 2024 13:23:32 GMT</pubDate>
      <description>
      <![CDATA[ The good news is that advances in data acquisition technologies mean that a whole new world of soft data – from downhole geophysics, multi- and hyper-spectral core scanning, and more routine collection of geometallurgical parameters – is now available to inform and enhance resource modelling and estimation.
 ]]>
      </description>
      <content:encoded>
      <![CDATA[ 
Authored by Michael Mattera, GEOVIA Industry Process Consultant Senior.



To estimate the properties of a mineral deposit as reliably as possible — especially as economic orebodies around the world become increasingly complex — a geologist must thoroughly understand the deposit as well as the method of emplacement/mineralisation.



And the only way geologists can do that is through using sound, dependable data.



Yet traditionally, mining companies have relied solely on two exploratory drilling methods to obtain the physical samples (typically the only working data) that resource geologists use to model and estimate mineralisation:




diamond drilling, which involves extracting small-diameter cylinders of rock core for analysis



reverse-circulation drilling, which involves collecting crushed rock cuttings for analysis.




The result is that billion-dollar decisions are based on the physical analysis of a very small amount of material while the bulk of the material to be mined, both overburden/waste and the mineralised orebody itself, remains unexamined.



To ensure higher quality resource estimates, geologists need more data.



The good news



The base data for geological modelling and resource estimation can be classified as either hard data (data that is directly observed and measured), or soft data (which make up the bulk of what’s known as ‘big data’) from other sources.



The good news is that advances in data acquisition technologies mean that a whole new world of soft data – from downhole geophysics, multi- and hyper-spectral core scanning, and more routine collection of geometallurgical parameters – is now available to inform and enhance resource modelling and estimation.



For example, using soft data can help resource geologists detect correlations between variables that might not be immediately obvious from hard data alone, such as a subtle alteration pattern that is evident from hyper-spectral core scans but not in assay results. Including additional geometallurgical-related parameters, such as hardness or grindability, acid consumption, moisture content, or clay minerals, can also:




highlight potential processing issues or abnormal values that wouldn’t be recognised with a more limited dataset



help define trend surfaces, such as gradual changes in mean values that can be removed from the data to improve the quality of estimates



identify variables to be estimated that might not normally be included in the block models that represent the material to be mined.
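One simple way to screen for such cross-variable relationships is a correlation matrix over the combined hard and soft dataset. The sketch below is illustrative only, using synthetic data and hypothetical variable names:

```python
import numpy as np

# Synthetic per-sample table with hypothetical names: one hard variable
# (Cu assay) and two soft variables (a hyper-spectral alteration index
# and a hardness proxy). In practice these come from the drillhole database.
rng = np.random.default_rng(42)
n = 500
alteration = rng.normal(0.0, 1.0, n)                   # spectral alteration index
cu_assay = 0.8 * alteration + rng.normal(0.0, 0.4, n)  # linked to alteration
hardness = rng.normal(14.0, 2.0, n)                    # unrelated in this example

corr = np.corrcoef(np.vstack([cu_assay, alteration, hardness]))

# Flag variable pairs whose absolute correlation exceeds a screening threshold.
names = ["cu_assay", "alteration", "hardness"]
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        if abs(corr[i, j]) > 0.5:
            print(f"{names[i]} vs {names[j]}: r = {corr[i, j]:.2f}")
```

Pairs flagged this way would then be examined geologically before being trusted for estimation.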




Using geometallurgical and other parameters – through using self-organising maps, for example – also contributes to better domaining of the mineralisation. This is because it allows geologists to consider many more characteristics as they define which volumes of material share similar characteristics and which are distinct.
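For intuition, a self-organising map can be hand-rolled in a few lines. This is a toy sketch for standardised multivariate sample data, not the implementation behind any particular package:

```python
import numpy as np

def train_som(samples, grid=(4, 4), iters=2000, lr0=0.5, sigma0=2.0, seed=0):
    """Minimal self-organising map: returns node weights on a 2-D grid."""
    rng = np.random.default_rng(seed)
    rows, cols = grid
    nodes = np.array([(r, c) for r in range(rows) for c in range(cols)], float)
    w = rng.normal(size=(rows * cols, samples.shape[1]))
    for t in range(iters):
        x = samples[rng.integers(len(samples))]
        bmu = np.argmin(((w - x) ** 2).sum(axis=1))    # best-matching unit
        frac = t / iters
        lr = lr0 * (1 - frac)                          # decaying learning rate
        sigma = sigma0 * (1 - frac) + 0.5              # shrinking neighbourhood
        d2 = ((nodes - nodes[bmu]) ** 2).sum(axis=1)   # grid distance to the BMU
        h = np.exp(-d2 / (2 * sigma ** 2))             # neighbourhood kernel
        w += lr * h[:, None] * (x - w)
    return w

def assign_domains(samples, w):
    """Label each sample with its best-matching node (a candidate domain)."""
    return np.argmin(((samples[:, None, :] - w[None]) ** 2).sum(axis=2), axis=1)
```

Samples that map to the same or neighbouring nodes share similar multivariate characteristics and become candidates for a common domain; variables should be standardised before training.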



In addition:




Adding in big data — via techniques such as co-kriging using secondary variables — can help produce estimates that take more localised (at the selective mining unit scale) variations in the mineralisation into account, while still:



achieving an acceptable slope of regression (a standardised measure of the quality of the estimates)



minimising conditional bias (the tendency for the true value to be less than the estimate when the estimate is high, and greater than the estimate when the estimate is low).
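Both measures can be computed directly whenever validation or reconciliation pairs of true and estimated values are available. A minimal sketch, using synthetic data:

```python
import numpy as np

def slope_of_regression(true_vals, estimates):
    """Slope of the linear regression of true values on the estimates.

    A slope near 1 indicates conditionally unbiased estimates; a slope well
    below 1 signals the classic pattern of high estimates over-calling and
    low estimates under-calling the true grade.
    """
    est = np.asarray(estimates, float)
    tru = np.asarray(true_vals, float)
    return np.cov(est, tru)[0, 1] / est.var(ddof=1)

# Synthetic reconciliation data: estimates carry error unrelated to the truth,
# which produces conditional bias and drags the slope below 1.
rng = np.random.default_rng(7)
truth = rng.lognormal(0.0, 0.5, 1000)
noisy_estimates = truth + rng.normal(0.0, 0.4, 1000)
print(f"slope = {slope_of_regression(truth, noisy_estimates):.2f}")
```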



If a mining company chooses not to complete, for reasons of time or money, a full analysis of all the attributes of all physical samples, geologists can use big data to fill in (impute) missing values using estimation techniques, proxy formulas, or correlations. Once they have all desirable attributes available for each sample, they can then return to more conventional techniques, such as kriging, to produce estimates or simulations — a set of equally probable realisations of the estimates — for all required parameters.
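A sketch of the imputation step, assuming a simple least-squares proxy from correlated, fully sampled attributes is acceptable (synthetic data, hypothetical attribute names):

```python
import numpy as np

def impute_by_regression(X, y):
    """Fill missing values in y (NaN) from correlated columns X via least squares.

    X: (n, k) array of fully sampled attributes (e.g. assays);
    y: (n,) attribute measured on only a subset of samples (e.g. hardness).
    Returns a copy of y with NaNs replaced by regression predictions.
    """
    X1 = np.column_stack([np.ones(len(X)), X])   # add intercept column
    known = ~np.isnan(y)
    beta, *_ = np.linalg.lstsq(X1[known], y[known], rcond=None)
    out = y.copy()
    out[~known] = X1[~known] @ beta
    return out

# Synthetic example: hardness measured on roughly 30% of samples, assays on all.
rng = np.random.default_rng(3)
assays = rng.normal(1.0, 0.3, (200, 2))
hardness = 10 + 4 * assays[:, 0] - 2 * assays[:, 1] + rng.normal(0, 0.2, 200)
sparse = hardness.copy()
sparse[rng.random(200) > 0.3] = np.nan           # drop ~70% of measurements
filled = impute_by_regression(assays, sparse)
```

Imputed values should always be flagged as such in the database so that downstream estimation can weight them appropriately.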



Incorporating big data as part of the resource modelling and estimation workflow increases the ability for resource geologists to:



highlight areas of higher risk (with, for example, elevated levels of deleterious elements or material with potential processing problems) that could be subject to additional environmental or social considerations



adopt the industry best practice scorecard approach to the classification of the Mineral Resource estimates (from low to high confidence: Inferred, Indicated and Measured)



improve mine site safety by identifying zones that might have poor ground conditions or require a change to standard mining practices (thereby introducing non-standard or unexpected behaviour).
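Purely as a toy illustration of a scorecard-style rule, a block's confidence class might key off its drillhole support. The thresholds below are invented, and a real classification weighs many more criteria:

```python
import numpy as np

def classify_block(sample_dists, n_holes):
    """Toy confidence scorecard for a single block, based on drillhole support.

    sample_dists: distances (m) from the block centroid to its informing samples;
    n_holes: number of distinct drillholes those samples come from.
    Thresholds are invented for illustration; a real scorecard also weighs
    data quality, QA/QC, geological continuity, and variography.
    """
    d = np.mean(sorted(sample_dists)[:3])   # mean distance to the 3 nearest samples
    if n_holes >= 3 and d <= 25:
        return "Measured"
    if n_holes >= 2 and d <= 60:
        return "Indicated"
    return "Inferred"

print(classify_block([12, 18, 22, 40], n_holes=4))   # densely drilled block
print(classify_block([35, 50, 58], n_holes=2))       # moderate support
print(classify_block([140], n_holes=1))              # a single distant hole
```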




The bad news



At the same time, however, all this additional information can result in an overload of big data, potentially many terabytes in size, with varying degrees of accuracy, all of which must be separately validated before it can be used.



That validation can add substantial time and effort, since the new, non-traditional data must be made to work with — and be stored and visualised alongside — the traditional physical drilling information, such as lithologies and assays, typically found in a geologist’s resource database. Finally, big data also makes (potential) automatic modelling and simulation a complex, processing-intensive task.











So what should mining companies do?



With clear advantages to using big data, despite what it demands in time and effort, miners need to keep in mind that a poor dataset will always produce a poor estimation. A good dataset, taking into account all available data, will produce an estimation that is more statistically sound, with clearly defined reasoning behind each of the decisions made along the way.



To make the best use of all available data, mining companies should consider how they want to address four specific challenges:



1. Storage



The process of acquiring, validating, and analysing the base data for resource estimation is time consuming and expensive, which means miners must consider the value of the information and knowledge derived from that data when determining how they will store it.



They must also decide how long to store it for: it may take years or even decades before a company makes the decision to mine, while the mining operation itself can take place over decades, so the lifecycle of the data is also long. But even data that is decades old can remain valid and useful for analysis/modelling if it is appropriately stored and, most importantly, still available.



Currently, however, geologists often store the initial data they collect during the exploration phase on a laptop, which both limits access to this data by other project teams and increases the risk that the data — and its potential value — could be lost at any time if a geologist changes roles or the device is retired.



2. Multiple sources



Geologists need to be able to retrieve and use information easily, but the sheer range of data now available for geological modelling and resource estimation can make that difficult.



Today, base data comes in a wide variety of types: lab results supplied directly from Laboratory Information Management Systems (LIMS), descriptions of the diamond drilling core from which the physical samples were extracted, and data from, for example, hand-held/portable X-ray fluorescence (XRF) analysers and data historians that record drilling penetration rates. It also includes metadata — additional details, such as the time of day the data was collected and the person, company, or piece of equipment that collected it — that are vital for confirming whether the data is in its original form, has been manipulated or adjusted, or is a calculated average.



This leads to a dataset made up of a diverse collection of text files, Excel spreadsheets, and resource models in proprietary binary file formats, alongside data stored in geoscientific information management software packages and core scans (which alone can run to terabytes), with much of it collected at different times and by different people and equipment.



3. Data lifecycle



Large amounts of data from multiple sources, acquired over many years, increase the challenges of both data domaining (dividing the rock mass into volumes with similar characteristics that are distinct from each other) and the Mineral Resource classification process.



Resource geologists must now consider the lifecycle of the data used in resource classification and find a way to accommodate and flag drilling results and other data with lower confidence (or which failed the validation test), while not losing portions of that dataset, such as lithological/structural interpretations, that could still be used for resource modelling purposes.



Also, as more data is collected, geologists may deem historical data with no or inappropriate quality assurance/quality control (QA/QC) less reliable for use in mineral resource estimation, and must have a way to incorporate this finding into the database to ensure only the highest quality data is used. For example, if newer, more accurate collar/downhole surveys or laboratory analysis methods identify weaknesses in previously collected data, that new data could make the use of historical data (such as lithological contact positions or assay information) inappropriate, depending on how the data is used in the resource definition and estimation process.



The same might happen with biased historical data. Bias usually only becomes apparent and downgrades confidence in the data after a considerable period of time. It is crucial to maintain all metadata so that the data does not have to be revalidated before it is used in each resource update cycle.



4. Database management



In order to manage a resource estimation dataset that includes an array of big data properly, resource geologists need to be able to:




Discriminate between hard data, soft data, and any metadata that also needs to be included in the resource dataset, and store their reasons for considering the data suitable for estimation or not.



Maintain the integrity of the resource dataset to ensure that the level of confidence (low to high) in the data can be used to appropriately:



classify the confidence level of the resource estimates



determine the risk profile of the decisions based on those estimates.



Control access to the database to:



ensure that only validated and approved data (as opposed to raw data on which the QA/QC has not been verified) is used in the resource estimation process



identify where other data has been confirmed as suitable only for modelling the geology (such as the extent of the mineralised lithologies) as opposed to estimating the mineral content or other properties of the material to be mined, including waste/overburden.



Provide proof of a strong chain of custody for all data that will confirm, for example, that assay data has not been manipulated. This proof will increase confidence in the estimates during external independent reviews, and illustrate that the dataset is being well governed — a vital consideration for financing.




The future is in the cloud



While these are significant challenges, they are not insurmountable, and the future for resource modelling and estimation is, in my opinion, in the cloud.



A cloud-based platform:




removes storage limitations and allows for on-demand access to both data and processing power



ensures high availability, which can replace current back-up and disaster recovery processes, except for those that are time-sensitive and can affect mining/production



provides a central location to store and share all data, with sufficient on-demand computing resources available to accommodate repeatable workflows, rather than a collection of independent, difficult-to-back-up processes run on separate devices with limited processing power



offers the option of using standardised workflows to capture deep specialist knowledge, which then becomes permanently retained, role-based knowledge



enables processes that depend on access to powerful computing resources to be run more efficiently, both in time and cost, than using local machines with limited capabilities — for example, with a cloud-based computing platform, it becomes much more feasible to routinely undertake valuable studies when new data becomes available, such as simulating variations:



in the geology model and the resource estimates, and then building multiple mine plans based on these variations, or



in the beneficiation process when handling ore with differing chemical characteristics or ratios of ore types



makes it possible to:



quickly incorporate artificial intelligence and machine learning techniques in workflows to automate a number of time consuming and/or repetitive tasks



construct workflows to produce financial models that incorporate much more of the underlying inherent variability of the mineralisation as opposed to those based on average assumptions — making true risk-based decisions using robust confidence intervals placed on key metrics, such as Net Present Value or Internal Rate of Return, possible.




In short, by being able to store, process, integrate, share, and display all available data types required for high-quality resource modelling and estimation, a cloud-based platform will contribute to overall improved orebody knowledge and understanding of the controls on mineralisation.



This in turn will result in significant downstream benefits, including better blending of material for processing, more consistent plant throughput, and ultimately, most importantly, higher product quality and increased profit.







Community is a place for GEOVIA users – from beginners to experts and everyone in between – to get answers to your questions, learn from each other, and network. Join our community to know more:



GEOVIA User Community – Read about industry topics from GEOVIA experts, be the first to know about new product releases and product tips and tricks, and share information and questions with your peers. All industry professionals are welcome to learn, engage, discover and share knowledge to shape a sustainable future of mining.



New member? Create an account, it’s free! Learn more about this community HERE.
 ]]>
      </content:encoded>
      </item>
<item>
      <title>
      <![CDATA[ Building Efficient and ESG-Compliant Mines with Virtual Twins ]]>
      </title>
      <link>https://blog--3ds--com.apsulis.fr/brands/geovia/building-efficient-and-esg-compliant-mines-with-virtual-twins/</link>
      <guid>https://blog--3ds--com.apsulis.fr/guid/275082</guid>
      <pubDate>Thu, 12 Dec 2024 13:13:45 GMT</pubDate>
      <description>
      <![CDATA[ In order to supply demand, the mining industry must discover more copper deposits, increase inventory, and accelerate production. The challenge lies in developing mines quickly while adhering to ESG norms, environmental regulations, and community stakeholder buy-in. Integrating technology across the mining value chain can reduce uncertainty and expedite this process. 
 ]]>
      </description>
      <content:encoded>
      <![CDATA[ 
By Gustavo Pilger, Director – Worldwide GEOVIA Research and Development Strategy and Management, Dassault Systèmes.



Fossil fuels currently meet about 75% of global energy needs. To transition to renewable energy sources by 2050, significantly larger quantities of minerals, especially copper, must be mined. The current rate of production will not suffice to meet demand. With only 800 million tons of known copper reserves, is it possible to supply this demand? And if so, can it be done with minimal environmental impact while adhering to ESG norms and public scrutiny?



In order to supply demand, the mining industry must discover more copper deposits, increase inventory, and accelerate production. The challenge lies in developing mines quickly while adhering to ESG norms, environmental regulations, and community stakeholder buy-in. Integrating technology across the mining value chain can reduce uncertainty and expedite this process.



Avoiding silos in mining sub-processes with technology



Processes and sub-processes in mining typically occur in silos, leading to frustration among stakeholders due to inefficiencies that often cause disruption and rework. Engagement is key to fostering the collaboration that can break down silos and bring teams together toward common goals. Effective communication takes stakeholders on a journey toward a shared understanding of contexts, objectives, constraints, risks, planning, and risk mitigation measures for a given target. This is greatly facilitated when all stakeholders work collaboratively on a unified platform with consistent data models and a consistent user experience.



Mining could be seen as a series of interconnected systems, or a ‘system of systems,’ spanning processes from securing permits and exploration through development, production, beneficiation, sales and distribution, decommissioning, and site rehabilitation. How do you connect systems, technology, and people to ensure this system-of-systems “machine” works in a synchronized and harmonious manner while constantly chasing value and common KPIs? A first logical step is to clearly delineate the boundaries of these systems and identify how data flows, or how it should flow, within them.



This system-of-systems framework must recognize a constellation of systems and understand that the output of one system is the input of an adjacent one. It must also be aware of the decisions made within each system’s boundaries, as well as their implications for upstream and downstream processes. Virtual twin experiences provide this framework.




“Dassault Systèmes’ solution for sustainable, energy-efficient mining that meets environmental norms and revenue targets centers around virtual twin experiences.”








Virtual twins help embed sustainability in mining operations











The virtual twin provides a live, virtual replication of the real world, where processes and systems are interlinked and associated with one another. This includes the underlying data that informs and describes these processes, all interconnected from a multiphysics, multiscale, and multidisciplinary perspective.



With virtual twin experiences, associated data and intelligent methods help mining organizations pursue value throughout their operations, adapting to uncertainty and unplanned events of technical, mechanical, or market origins.



Can you change the way you operate by designing mines that extract more minerals more efficiently, while minimizing energy use and complying with ESG standards and environmental norms? Yes, the virtual twin can assist in this transformation. It can model and simulate many possible outcomes and environmental and social scenarios, balancing efficiency and costs with ESG norms while maximizing safety and value.




“By connecting various data and systems, the virtual twin creates a unified collaborative environment, allowing stakeholders to identify priorities and measure performance against benchmarks for responsible and sustainable growth.”




The virtual twin also enables the management to control permit status, asset agreements, asset licenses, and associated cost analysis to ensure that everything proceeds according to plan.



The virtual twin ensures continuity between the natural environment, claim boundaries, and built infrastructure. Users can leverage immersive visualization with spatial contextualization to gain a comprehensive view of data for actionable insights. This ensures sustainability concerning energy and emissions, water, and biodiversity. Environmental, social, regulatory, and sustainability KPIs can be measured against benchmarks using the virtual twin.



The virtual twin also provides visibility into data with powerful integrated analytics and geospatial data. It allows the aggregation and propagation of data on land stewardship in line with company, compliance, and regulatory frameworks.




“Digital communities created through the virtual twin facilitate two-way data and model sharing, ensuring a unified user experience for all stakeholders.”








Why virtual twins are essential for energy efficiency in mining



Mining is one of the most power-intensive industries. Collectively, mining consumes about 11% of the world’s energy, primarily from fossil fuels. Therefore, powering mining and related infrastructure with renewable energy sources is essential for sustainability. Renewable energy also offers opportunities for cost savings, innovative mine designs, and resilience against uncertainties.




“With the virtual twin, mining energy systems can be designed as a single platform, allowing for the design and simulation of energy supply and sources.”




Many mine operators are likely to adopt a hybrid approach to energy. The virtual twin supports this hybrid approach by providing advanced controls that enable miners to design, simulate, and visualize optimal energy configurations while reducing overall operating costs, thereby de-risking the electrification process.



In short, virtual twin experiences should play a major role in building the next generation of copper mines and in attracting an ever-growing digital-native workforce, helping achieve sustainability goals and ESG mandates. Virtual twins are essential not only for de-risking mining projects from a multidisciplinary perspective but also for navigating business complexity while ensuring sustainability. Ultimately, virtual twin experiences are a fundamental communication tool for engaging the communities where mines operate, as well as governments, regulatory agencies, shareholders, and employees, from the outset, while fostering sustainable mining projects with shared interests.







 ]]>
      </content:encoded>
      </item>
<item>
      <title>
      <![CDATA[ Improving Geological Modelling in the Age of Data Overload ]]>
      </title>
      <link>https://blog--3ds--com.apsulis.fr/brands/geovia/improving-geological-modelling-in-the-age-of-data-overload/</link>
      <guid>https://blog--3ds--com.apsulis.fr/guid/275067</guid>
      <pubDate>Thu, 12 Dec 2024 12:06:05 GMT</pubDate>
      <description>
      <![CDATA[ However, while today’s big data and advanced technology can help geologists generate more accurate models than ever before, there are still challenges to overcome in using all that data and technology successfully. Fortunately, there are also solutions.
 ]]>
      </description>
      <content:encoded>
      <![CDATA[ 
Authored by Jacques Nel, GEOVIA Senior Industry Process Consultant.



Geologists face many challenges in dealing with the unprecedented amount of geoscience data available today. In this article, Jacques Nel discusses software developed by Dassault Systèmes with the potential to help mines to rise above the chaos and use all of their data to generate more accurate geological models that give them a critical strategic edge.



A geological model is a specific representation of the location, characteristics, and extent of the lithology and ore types of a mineral deposit. It is based on the knowledge of the geologist and on data gathered through such sources as geological field observations, drillhole records, geophysical surveys, and assays. It is a vital input for both the resource model and mine planning, and affects virtually every decision made throughout the mining process.



To put it another way, the success or failure of a mine largely rests on the shoulders of the geologist responsible for the original geological model.



However, while today’s big data and advanced technology can help geologists generate more accurate models than ever before, there are still challenges to overcome in using all that data and technology successfully. Fortunately, there are also solutions.



Challenge 1: Importing and managing the data required for an accurate geological model



Geoscience data is key to any technical or financial evaluation of a mineral deposit because it helps define the site, shape, and grade of the orebody.



Today, however, geologists are inundated with so much new geoscience data — structural, geochemical, lithological, remote sensing, etc. — that they often don’t know what to do with it all. At the same time, valuable historical data may be stored on a number of different servers, on staff members’ personal laptops, on outmoded disks or drives, or even in long-forgotten physical filing cabinets. There may also be so many different versions of some datasets that it’s impossible to know which is the latest.



All of this can lead to a lack of confidence in the data used for modelling and downstream decision-making processes, and/or to mines using valuable time and money to re-explore or re-acquire lost data. It also makes it difficult for mines to comply with Environmental, Sustainability, and Governance (ESG) regulations that call for them to become more transparent in how they acquire, store, manipulate, and use geoscientific data.



Challenge 2: Visualising and analysing the data required for an accurate geological model



Data visualisation gives geologists insight into their geoscience information and helps them identify geological trends or patterns as well as erroneous data. Data visualisation, however, like data management, has been made much more complicated by the vast quantities of data available today from both exploration and operational projects — most of it in different data types from different sources in a variety of file formats.



Those different sources and formats make it difficult for geologists to visualise and analyse their data in one application, meaning they could miss important trends or patterns, while the workflows currently available for integrating separate datasets may be too time-consuming or unreliable to use. The result is that valuable geoscientific data may be inadvertently excluded from the analysis, reducing the accuracy of the geological model.



Challenge 3: Interpreting and modelling the geology of a mineral deposit



Throughout much of the world, regulatory bodies require geologists to adhere to strict standard operating procedures and regulations when generating geological models. This is not easy considering:




the number of tasks that must be completed to generate a model



the amount of data linked to each task



the timeframe allocated for each task, and



the complexity of keeping track of tasks, data, and timeframes.




But failing to adhere to these operating procedures and regulations is not an option, because it could lead to non-compliance, which would have a negative impact on the mine and/or mine company.



Having data stored in multiple repositories in many different file types compounds this challenge by forcing geologists to interpret data in several separate applications, which once again (see Challenge 2 above) may lead them to miss critical trends or patterns and so produce incorrect interpretations and inaccurate geological models.



Challenge 4: Validating and sharing a geological model



Once a geologist has interpreted geoscience data and created a base geological model, there are a number of different methods they can use to ensure it has been generated as accurately as possible. These include:




doing a geostatistical simulation to construct geological scenarios that can test or quantify the uncertainty of the model



conducting a visual analysis of the domains compared to the actual data



using query filters to test the model



making a direct comparison of the new geological model to the previous model, and/or



sharing the model across the organisation for collaborative peer review.
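The direct-comparison method, for instance, can start with something as simple as reporting how much block volume changed domain between model versions. A minimal sketch with synthetic labels:

```python
import numpy as np

def domain_change_report(prev_labels, new_labels, volumes):
    """Compare domain labels of two block-model versions on the same blocks.

    Returns the fraction of total volume whose domain label changed — a quick
    screen for unexpected shifts between model updates.
    """
    prev_labels = np.asarray(prev_labels)
    new_labels = np.asarray(new_labels)
    volumes = np.asarray(volumes, float)
    changed = prev_labels != new_labels
    return volumes[changed].sum() / volumes.sum()

# Hypothetical five-block example with per-block volumes in cubic metres.
prev = np.array(["ORE", "ORE", "WASTE", "OXIDE", "WASTE"])
new = np.array(["ORE", "WASTE", "WASTE", "OXIDE", "OXIDE"])
vol = np.array([1000.0, 1000.0, 2000.0, 500.0, 500.0])
print(f"{domain_change_report(prev, new, vol):.1%} of volume re-domained")
```

A large changed fraction does not mean the new model is wrong, but it flags areas where the geologist should confirm that the change is driven by new data rather than an interpretation error.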




The challenge for many geologists right now, however, is that they may have only a single technique or application they can use to validate their model, which means they may miss certain issues that can render the model — and all the critical mine planning decisions made based on it — inaccurate. This in turn can lead to imprecise mining optimisations, possible safety issues, and less-than-expected production.



Sharing the model for peer review may also be difficult. Departments in the mining industry have traditionally acted in silos, which has made cross-departmental collaboration challenging. 



Many also still use email or flash drives to share important files, which makes it hard to be sure which version is the latest. Not all departments may have access to applications in which they can view or statistically analyse the model, and acquiring new licences can be costly.



Addressing the challenges



Overcoming all four of these challenges — the need for secure, effective data management; comprehensive geological visualisation and analysis; correct data interpretation; and accurate model validation — begins with ensuring that all geoscience data is integrated, stored, analysed, interpreted, and managed in a single, platform-based centralised repository.



This centralised repository will ensure that the latest, most complete version of any data becomes the single source of truth for everyone to reference. 



And it has the added bonus of helping to minimise the financial cost of obtaining geoscience data by increasing the percentage of it actively used in creating models.











One example of a platform-based centralised repository is the Dassault Systèmes 3DEXPERIENCE platform. To assist with geological modelling, we have also developed a customised Geology Modelling repository, where all of a mine’s geoscience data is first securely stored on the platform, either on a mine company’s own premises or on a public or private cloud.



From this central hub, industry-proven applications, tools, and workflows make it far simpler for geologists to locate, interpret, display, and analyse their data, as well as to create, validate, and share geological models.



It works like this: The 3DEXPERIENCE platform connects to both GEOVIA Surpac geology and mine planning software and ENOVIA project and document management software. 



This enables geologists to drag and drop any data (drillhole, topographical point cloud, geophysical, assay, geotechnical, etc) held on the platform in any format into the GEOVIA Surpac graphics window to begin work immediately — no data conversion or lengthy import or export processes required. 



This kind of integration also has the benefit of automatic document versioning, data check-in/check-out, and user file and folder permissions to ensure traceability and accountability.
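To make the check-in/check-out idea concrete, here is a minimal sketch of a version-tracked, lockable file record. It illustrates the general pattern only, not the platform’s actual API; the class and method names are invented.

```python
# Hypothetical sketch: automatic versioning plus check-out locking.
class ManagedFile:
    def __init__(self, name, content=""):
        self.name, self.versions, self.locked_by = name, [content], None

    def check_out(self, user):
        """Lock the file for one user and hand back the latest version."""
        if self.locked_by and self.locked_by != user:
            raise PermissionError(f"locked by {self.locked_by}")
        self.locked_by = user
        return self.versions[-1]

    def check_in(self, user, content):
        """Append a new revision (history is never overwritten) and release the lock."""
        if self.locked_by != user:
            raise PermissionError("check out before checking in")
        self.versions.append(content)
        self.locked_by = None

f = ManagedFile("pit_design.dxf", "rev A")
f.check_out("geo1")
f.check_in("geo1", "rev B")
print(len(f.versions), f.versions[-1])  # prints: 2 rev B — full history retained
```

Because every revision is kept and every lock names a user, "who changed what, and when" is answerable by construction — which is the traceability point made above.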



The software combination also gives geologists the ability to:




synchronise data for fast 3D visualisation



apply geological reasoning and logic to sculpt domain solids from all available data



create and compare various interpretations for “what if” scenario analysis to quantify the natural uncertainty



track the entire evolution of the geology model from data interpretation to model generation and validation



create a project plan, assign tasks, and monitor project progress (and any issues or bottlenecks) against the plan using a variety of visual methods, and



share the 3D geological model and any statistical reports and charts with stakeholders through customisable dashboards or communities.




The result is improved compliance with Standard Operating Procedures, ESG regulations, and the mine’s own KPIs. It also allows greater confidence in the quality of the geoscience data and the accuracy of the geological model, and better decision-making throughout the life of the mine.







Community is a place for GEOVIA users – from beginners to experts and everyone in between – to get answers to your questions, learn from each other, and network.&nbsp;Join our community to know more:



GEOVIA User Community&nbsp;–&nbsp;Read about industry topics from GEOVIA experts, be the first to know about new product releases and product tips and tricks, and share information and questions with your peers. All&nbsp;industry professionals are welcome to learn, engage, discover and share knowledge to shape a sustainable future of mining. &nbsp;



New member?&nbsp;Create an account, it’s free!&nbsp;Learn more about this community&nbsp;HERE.
 ]]>
      </content:encoded>
      </item>
<item>
      <title>
      <![CDATA[ Unlocking Value in Mining ]]>
      </title>
      <link>https://blog--3ds--com.apsulis.fr/brands/geovia/unlocking-value-in-mining/</link>
      <guid>https://blog--3ds--com.apsulis.fr/guid/274447</guid>
      <pubDate>Mon, 02 Dec 2024 11:59:53 GMT</pubDate>
      <description>
      <![CDATA[ The bottom line is that a centralised system deployed either on premises or on cloud would make the execution of the company’s Disaster Recovery Plan way more straightforward compared with decentralised systems.
 ]]>
      </description>
      <content:encoded>
      <![CDATA[ 
Gustavo Pilger, Dassault Systèmes, GEOVIA on the critical role of centralized data management for efficiency, security, and innovation in mining operations.



The rapid growth of data, driven by technological advancements, presents both benefits and risks. Consolidating and centralizing data management is important to enhance efficiency, cybersecurity, and business intelligence. We discuss the critical challenges and opportunities in managing data within the mining industry and also explore the role of Dassault Systèmes&#8217; 3DEXPERIENCE platform in enabling mines to optimize processes, ensure data security, and improve disaster recovery capabilities.



Q) Please outline the key challenges that the mining industry faces in managing, protecting, and storing data.



Data is part of the IP portfolio of a company (together with a range of other assets).  Therefore, it should be managed like any other valuable asset. Over the last two decades in particular, with technology advancements and the advent of a range of sensors, we have seen an &#8220;explosion of data&#8221; across industries including mining. 



This brings opportunities and challenges at the same time. The opportunities are mainly associated with the potential to better understand processes enabling one to improve them with productivity and efficiency gains that often lead to cost savings.



To achieve this state, however, one needs to overcome a few challenges: from navigating a plethora of data to extract knowledge, to cybersecurity risks that could expose corporations to significant financial losses. Having a range of data with which to unlock or optimize mining processes is a great asset. 



However, one of the first challenges is to consolidate data that is often captured and stored in different systems. Not only are these data stored in decentralised (local), disparate repositories, but the systems are administered by different people with different levels of responsibility and awareness when it comes to data integrity and the related risks. 



So, it is important that data is properly stored and managed in a way that allows one to extract the most knowledge from it while preserving its integrity and limiting its exposure.



Q) How should mining companies approach consolidating and centralizing their data management to enhance data security?



The first step towards data consolidation is to compile a data inventory across the mine including information about type, format, purpose, frequency of change, etc. This allows one to map out the data flow intra- and inter-processes across the mine to then assess what matters the most and where potential bottlenecks are in order to prioritise where to begin. 
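A first pass at such an inventory can be sketched as one structured record per dataset, grouped to reveal which processes depend on the most scattered data. The field names and example entries below are illustrative, not a standard schema.

```python
from dataclasses import dataclass

@dataclass
class DatasetRecord:
    """One entry in a mine-wide data inventory (illustrative fields)."""
    name: str
    data_type: str      # e.g. "drillhole", "assay", "survey"
    fmt: str            # file or database format
    purpose: str        # the process the data feeds
    change_freq: str    # "daily", "weekly", "static", ...
    system: str         # where it currently lives

def systems_by_process(inventory):
    """Count distinct source systems feeding each process: processes
    served by many disparate systems are consolidation priorities."""
    counts = {}
    for rec in inventory:
        counts.setdefault(rec.purpose, set()).add(rec.system)
    return {proc: len(systems) for proc, systems in counts.items()}

inventory = [
    DatasetRecord("collars", "drillhole", "csv", "resource modelling", "weekly", "shared drive"),
    DatasetRecord("assays", "assay", "sql", "resource modelling", "daily", "lab LIMS"),
    DatasetRecord("pit survey", "survey", "dxf", "short-term planning", "daily", "survey PC"),
]
print(systems_by_process(inventory))  # resource modelling is fed by 2 systems
```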



Therefore, understanding the data ecosystem, together with the impact it has across KPIs, is key to driving change in this space.



All sorts of data are being collected from a range of equipment (including sensors) across the environment of a mine. Together with good, valuable data also comes noisy data &#8211; and lots of it.  



Therefore, ideally, the data collected across the mine not only needs to be federated (or consolidated), but also needs to be indexed, sanitized (filtering out the noise), and contextualized so that meaningful insights can start to be extracted for decision-making.  
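The federate–index–sanitize–contextualize chain can be sketched in a few lines; the record fields, the plausibility range used as a noise filter, and the context tag are all invented for illustration.

```python
# Minimal sketch of the federate -> sanitize -> index/contextualize chain.
def federate(*sources):
    """Pull records from disparate repositories into one stream."""
    return [rec for source in sources for rec in source]

def sanitize(records, field, lo, hi):
    """Filter out noisy readings outside a plausible range (assumed rule)."""
    return [r for r in records if lo <= r[field] <= hi]

def index_and_contextualize(records, key, context):
    """Index by a common key and tag each record with its process context."""
    out = {}
    for r in records:
        out.setdefault(r[key], []).append({**r, "context": context})
    return out

truck_sensors = [{"truck": "T01", "payload_t": 91.5}, {"truck": "T01", "payload_t": -3.0}]
dispatch_log = [{"truck": "T02", "payload_t": 88.0}]

clean = sanitize(federate(truck_sensors, dispatch_log), "payload_t", 0, 400)
by_truck = index_and_contextualize(clean, "truck", "haulage")
print(sorted(by_truck))  # ['T01', 'T02'] — the negative payload was filtered out
```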



This could be achieved with the adoption of a centralized system that allows ingesting data collected by equipment across the mine, as well as their management in a safe and secure environment. The Dassault Systèmes 3DEXPERIENCE platform offers this solution.



Q) What critical benefits do mines gain from centralizing their data management?



I think the ultimate benefit is about being in control of the data instead of data taking control! One can only improve what is measured and understood! 



A centralized platform that allows data federation, indexation, 3D contextualization, analytics, and action management, all in a secure environment, puts you in control of your assets, allowing you to extract the most value from them.




Also, with decentralised systems a great amount of time is typically spent finding the right data, or the latest version of the data, to work with. This translates to enormous inefficiencies, errors, re-work, and frustration, leading to employee disengagement and creating a vicious cycle of inefficiency. A centralised system with rigorous access control processes, on the other hand, eliminates these inefficiencies. 




Every employee has access to the right data, in terms of permissions and versioning, to conduct their work. Every decision taken by employees is recorded and justified within the system, providing an inherent layer of traceability and auditability. Other benefits include de-risking data integrity and exposure.



Q) Tell us about the role of centralized data management in improving data analytics and business intelligence, and how this benefits mines and their personnel.



GEOVIA, a Dassault Systèmes brand, provides software tools that allow our mining clients to model and simulate processes, and how they interact with adjacent (connected) processes, before anything is actually built in early project development phases, or to correct the course of action on projects already in production in order to keep chasing value while operating.



Since the underlying data is federated, indexed, standardized, and contextualized in a safe and secure single repository, and systems are connected with input and output associated through common data models, one can test multiple hypotheses or scenarios in the virtual world (Virtual Twin Experience) to efficiently apply a given design or plan in the real world &#8211; eliminating unnecessary waste, reducing risk, and minimizing material re-handling while maximizing productivity! 



Data is not only safe and secure, but also indexed (for quick retrieval), standardized through semantic dictionaries, and contextualized, enabling meaningful links and associativity between processes and data.



It is this data associativity, combined with smart methods and algorithms, that allows one to constantly chase value while in operation, adjusting to uncertainty and unplanned events (be they technical, mechanical, or market-related). 



I’d like to emphasize that having this core data, industry knowledge and know-how supported by semantic dictionaries (ontologies) central to our business platform (3DEXPERIENCE) that is built on a multi-physics and multi-scale foundation allows us to go beyond Generative AI and Large Language Models (LLMs). 



With this core set of characteristics, what we offer instead is Industry Language Models (ILMs) that indeed leverage LLMs but are combined with ontologies and industry knowledge and know-how within a platform environment (3DEXPERIENCE) that inherently provides governance and traceability.



Q) Please explain the ways in which centralized data management enhances a mine’s disaster recovery capabilities and why this is critically important?



A decentralised data management system, with data fragmented and scattered across the corporation, would need to rely on the systematic discipline of the personnel in charge to regularly back up locally stored data, which is a challenge in itself. It would therefore be very hard (if not impossible) to fully recover should a disaster occur.



Instead, a centralised system can be restored in a matter of hours in case of disaster, assuming of course that appropriate levels of redundancy, training, and protocols are in place to keep disruption to a minimum.



The bottom line is that a centralised system deployed either on premises or on cloud would make the execution of the company’s Disaster Recovery Plan way more straightforward compared with decentralised systems.



Q) Ultimately, how does centralizing data management improve both a mine’s cybersecurity and the safety of its employees?



Data centralisation significantly reduces the risks associated with data integrity and cybersecurity. Consolidating data in a single repository reduces the risk of losing or corrupting data that would otherwise reside on the local drives of desktop computers across mine sites, or on the laptops of employees required to work with the data. 



Instead, on a centralised system such as the 3DEXPERIENCE platform, the right version of the right data is available at any time to the right people. Since the platform enforces a rigorous access control process, data is made available to employees according to their roles and needs. 




For example, a Surveyor does not need access to sensitive data such as gold grades from core logging, while a Resource Geologist does, as those grades are required for their work. Combined, all of this significantly mitigates the risks associated with data integrity, exposure, and cybersecurity.
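The role-based pattern described here can be illustrated with a minimal sketch; the roles, data classes, and permission table are hypothetical, not the platform’s actual model.

```python
# Hypothetical role-to-data-class permission table.
ROLE_PERMISSIONS = {
    "surveyor": {"topography", "survey_pickups"},
    "resource_geologist": {"topography", "drillholes", "assay_grades"},
}

def can_read(role, data_class):
    """Data is visible only if the role's permission set includes it."""
    return data_class in ROLE_PERMISSIONS.get(role, set())

assert not can_read("surveyor", "assay_grades")        # grades stay hidden
assert can_read("resource_geologist", "assay_grades")  # needed for estimation
```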




For those who choose to embrace the cloud to store and manage data via a cloud provider, be assured that the risks are well managed; arguably, better than in in-house data centres. 



This is because most cloud vendors, such as Dassault Systèmes, operate with heightened security practices tailored towards protecting their infrastructure, applications, and customer data. A good cloud provider will adhere to industry standards and best practices that include:




ISO 2700x standards, and in particular the implementation guide ISO 27002



NIST 800 series



OWASP (Open Web Application Security Project) methodologies



COBIT framework




Also, good cloud providers employ multiple, independent and redundant mechanisms at various levels to block attacks. These measures provide far better security than most organisations can provide for themselves.



Therefore, in terms of risk management, it is a win-win proposition for all, including corporations, employees, contractors, and customers.







 ]]>
      </content:encoded>
      </item>
<item>
      <title>
      <![CDATA[ Bridging the Gap in Real Estate with Virtual Twin Solutions ]]>
      </title>
      <link>https://blog--3ds--com.apsulis.fr/brands/geovia/bridging-the-gap-in-real-estate-with-virtual-twin-solutions/</link>
      <guid>https://blog--3ds--com.apsulis.fr/guid/273368</guid>
      <pubDate>Wed, 20 Nov 2024 11:14:57 GMT</pubDate>
      <description>
      <![CDATA[ GEOVIA Urban Planning solutions allow the creation of a comprehensive virtual twin within a single platform, seamlessly integrating both open-source and private BIM data. By reconciling 2D and 3D data, converting vector data into 3D, and georeferencing 3D designs, these solutions go far beyond basic visualizations. They offer advanced spatial analysis capabilities, enabling real estate developers and planners to assess site viability, evaluate environmental impacts, and simulate various scenarios effectively.
 ]]>
      </description>
      <content:encoded>
      <![CDATA[ 
Authored by Fabrice SERVANT, GEOVIA Customer Success Director, Dassault Systèmes



The real estate industry is undergoing rapid transformation, driven by global urbanization, evolving lifestyles, and the critical need for sustainable practices. In response, real estate professionals are increasingly exploring digital solutions that improve efficiency, foster&nbsp;collaboration, and optimize land use. At the forefront of these innovations is virtual twin technology, which allows developers and planners to visualize and simulate projects within their full spatial and environmental context. This is what Dassault Systèmes’ GEOVIA presents in its latest webinar on land prospecting.



Creation of a Virtual Twin



GEOVIA Urban Planning solutions allow the creation of a comprehensive virtual twin within a single platform, seamlessly integrating both open-source and private BIM data. By reconciling 2D and 3D data, converting vector data into 3D, and georeferencing 3D designs, these solutions go far beyond basic visualizations. They offer advanced spatial analysis capabilities, enabling real estate developers and planners to assess site viability, evaluate environmental impacts, and simulate various scenarios effectively.



Figure 1 &#8211; BIM model of the Dassault Systèmes WOOD building in the context of the Vélizy campus in France



Site Selection



For real estate developers, location is a primary driver of property value, making site selection a critical step before breaking ground. Depending on the type of project, factors like access to transportation, nearby schools, and cultural sites can greatly influence site selection. While much of this information is available through open data, meaningful analysis becomes more impactful when projected onto a virtual twin.



GEOVIA’s solutions go beyond traditional analysis, empowering land prospectors to create adaptive designs to meet specific requirements. By selecting a starting point on a map, prospectors can accurately analyze the surrounding area within the isochrones of their choice—like a 5-minute walk or a 10-minute bike ride—to gain precise insights.



Figure 2 &#8211; GEOVIA livability scoring analysis







This analysis can be enhanced to meet the project’s objectives by establishing a scoring system based on points of interest, such as schools, hospitals, and museums, and comparing these scores across multiple locations to identify the optimal sites. Beyond helping developers determine the ideal location, this solution also facilitates communication with non-technical stakeholders. By simulating real-world scenarios, professionals can refine their projects to reflect emerging trends, ensuring they stay relevant and appealing to prospective buyers.
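One way such a scoring system might work is as a weighted count of points of interest reachable from each candidate site. The sketch below uses straight-line distance in place of true travel-time isochrones, and the weights and coordinates are invented for illustration.

```python
from math import hypot

# Hypothetical weights per point-of-interest category.
WEIGHTS = {"school": 3, "hospital": 2, "museum": 1}

def site_score(site, pois, radius):
    """Sum weighted points of interest within `radius` of `site`
    (a crude stand-in for an isochrone reachability test)."""
    return sum(
        WEIGHTS.get(kind, 0)
        for kind, (x, y) in pois
        if hypot(x - site[0], y - site[1]) <= radius
    )

pois = [("school", (1.0, 0.0)), ("hospital", (0.0, 3.0)), ("museum", (9.0, 9.0))]
sites = {"A": (0.0, 0.0), "B": (8.0, 8.0)}
scores = {name: site_score(xy, pois, radius=4.0) for name, xy in sites.items()}
best = max(scores, key=scores.get)
print(scores, best)  # site A, near the school and hospital, scores highest
```

Comparing these scores across candidate locations is exactly the "identify the optimal sites" step described above, just with real isochrones and far richer POI data.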



Enhancing Collaboration and Decision-Making



GEOVIA’s solutions are integrated into Dassault Systèmes’&nbsp;3DEXPERIENCE platform – a secure, web-based, collaboration hub that allows stakeholders to aggregate, store, and share documents within the virtual twin environment with a single login. Serving as a single source of truth, this platform ensures that all stakeholders—from project managers to general contractors—have easy access to the most current information. This capability enhances traceability, facilitates the early identification of potential issues, and helps optimize workflows.



Figure 3 &#8211; 3DEXPERIENCE platform Idea Funnel







In addition to allowing users to monitor project progress by location, track real-time operations, and manage upcoming projects, the&nbsp;3DEXPERIENCE platform serves as a powerful collaboration tool. It enables stakeholders to interact and communicate within private or public communities directly on a 3D map. By clicking on a specific location of interest on the 3D map, stakeholders can create and access geolocated posts that provide the latest updates on new and upcoming developments. These posts can include text, images, documents, and links to the map, offering a detailed view of the site in a 3D context. This functionality preserves all relevant information throughout the project lifecycle and also empowers stakeholders to make faster, more informed decisions.



Looking Ahead



GEOVIA offers unparalleled insights through interactive market analysis. Visualize and analyze key real estate data in context, identifying opportunities that might otherwise remain hidden. This strategic advantage allows you to confidently make data-driven decisions, maximizing return on investment and optimizing resource allocation.



Discover how GEOVIA can turn your ideas into reality with a personalized demo. Explore how to analyze, collaborate, and manage your construction projects with&nbsp;GEOVIA Solutions&nbsp;and how virtual twins can redefine urban development. If you missed the &#8220;Maximize Your Real Estate Operations&#8221; webinar, watch the replay for practical tips and valuable insights. The webinar is in French, with English subtitles.







 ]]>
      </content:encoded>
      </item>
<item>
      <title>
      <![CDATA[ Generating Production Scenarios ]]>
      </title>
      <link>https://blog--3ds--com.apsulis.fr/brands/geovia/generating-production-scenarios/</link>
      <guid>https://blog--3ds--com.apsulis.fr/guid/273287</guid>
      <pubDate>Tue, 19 Nov 2024 11:58:16 GMT</pubDate>
      <description>
      <![CDATA[ Relying on Dassault Systèmes’ Delmia production simulation tool, which can create 3D models of specific, discrete events, the company developed thousands of scenarios to visualise how equipment would move and interact under various mine designs.
 ]]>
      </description>
      <content:encoded>
      <![CDATA[ 
In this series of articles so far, we have illustrated how an international copper mining company went about answering a simple question: would productivity increase and operating costs fall if they changed the design of a new block-caving project to accommodate larger load-haul-dump (LHD) machines?



Now we are ready to tell you what the company discovered.



After using Dassault Systèmes’:




parametric design tool to test the effect of the larger LHDs on their original mine design in a number of key parameters, such as tunnel section and spacing, pillar size and undercut and extraction level elevation



PCBC mine planning software and other tools to automatically analyse selected parametric designs, calculate the economic reserve, and create a summary of average copper value, average economic value, and total tonnes extracted, and



Abaqus geomechanical simulation software to analyse the geotechnical aspects of the deposit,




the company determined that the extraction strategy they had been pursuing in their selected designs was too fast. It was clear that they would need to extract material much more slowly in order to allow enough time for the cave to mature and propagate.



With that vital decision made, and new designs with a longer time period generated, the mine was ready for the final step: production simulations.



Production challenges



Relying on Dassault Systèmes’ Delmia production simulation tool, which can create 3D models of specific, discrete events, the company developed thousands of scenarios to visualise how equipment would move and interact under various mine designs.



This video, for example, illustrates how a loader might move to a draw point, gather the material, and dump it into the ore pass. 











This video of an open pit operation shows another type of equipment simulation (the basics are the same for block caving) available through DELMIA.











Simulations like these enable mine designers to evaluate a range of production-related issues, including equipment movement, and, in the case of the copper miner we have been focussing on through this series of articles, the effect of bigger or smaller equipment:




Do the roads get more congested with larger LHDs?



Will larger LHDs allow the mine to produce what they expect to produce?
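Even before a full discrete-event simulation, the second question can be framed with a toy load–haul–dump cycle model: how many full cycles fit in a shift, and how many tonnes does each bucket size deliver? All cycle times and bucket capacities below are invented for illustration.

```python
# Toy event-stepped model of one LHD's load-haul-dump cycle.
def shift_tonnes(shift_min, tram_min, load_min, dump_min, bucket_t):
    """Step through complete cycles (tram out, load, tram back, dump)
    until the shift runs out, accumulating tonnes moved."""
    clock, tonnes = 0.0, 0.0
    cycle = tram_min + load_min + tram_min + dump_min
    while clock + cycle <= shift_min:
        clock += cycle
        tonnes += bucket_t
    return tonnes

# Invented parameters: the larger LHD cycles slightly slower but carries more.
small = shift_tonnes(600, tram_min=3.0, load_min=1.0, dump_min=0.5, bucket_t=14)
large = shift_tonnes(600, tram_min=3.5, load_min=1.2, dump_min=0.6, bucket_t=21)
print(small, large)  # the larger bucket wins despite the slower cycle
```

A real DELMIA study layers congestion, queueing at ore passes, and equipment interaction on top of this; the toy model only shows why bucket size can dominate a modest cycle-time penalty.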




Results



After the copper mining company ran these final production simulations, they had all the answers they needed to determine whether they should stick with their original design for a new block-caving project, or change it to accommodate new, larger LHD machines.



The larger LHDs won the day.



Summary



Through parametric design, automation, and simulation — which took a matter of hours rather than weeks to accomplish — the company was able to create a final mine design that took into account:




optimal tunnel spacing, heights, and other parameters, including the offset between the tunnels and a geological structure at one end of the mine site, as well as draw-bell spacing and entry angles, ore pass locations, load elevations, etc.



the best possible economic reserve, average copper value, average economic value, and total tonnes extracted



geotechnical issues associated with the site in order to reduce the risks of poor caveability, large subsidence, air blasts, etc., and



the most beneficial extraction strategy for the site.




The result was a smaller mine than the copper company originally designed, with a significantly slower extraction strategy than they anticipated.



Instead of being disappointed with these results, however, the company was delighted.



Created with Dassault Systèmes software and expertise, the new, smaller mine design for the block-caving site is estimated to:




improve productivity by 20%



reduce operating costs by 10%, and



allow the company to save around $700 million US on the budget authorized for the original block-caving project with smaller LHDs.








 ]]>
      </content:encoded>
      </item>
<item>
      <title>
      <![CDATA[ Using Simulation to Complete Advanced Geotechnical Analyses ]]>
      </title>
      <link>https://blog--3ds--com.apsulis.fr/brands/geovia/using-simulation-to-complete-advanced-geotechnical-analyses/</link>
      <guid>https://blog--3ds--com.apsulis.fr/guid/273267</guid>
      <pubDate>Tue, 19 Nov 2024 11:50:06 GMT</pubDate>
      <description>
      <![CDATA[ Block caving requires a very large deposit, with sufficient height and footprint area, to be cost effective. It should also ideally include certain geotechnical characteristics, such as pre-existing rock fractures to speed fragmentation and enough rock mass strength to support extraction tunnels.
 ]]>
      </description>
      <content:encoded>
      <![CDATA[ 
Authored by Daniel Villa, GEOVIA Industry Process Consultant Expert.



In Article 1 of this series, an international copper mining company wanted to find out if it would be possible to increase productivity and reduce operating costs at a new block-caving project by changing their original design to accommodate new, larger load-haul-dump (LHD) machines.



Using a parametric design tool from Dassault Systèmes, the company tested the effect of the larger LHDs on their original mine design in a number of key parameters, such as tunnel spacing and undercut and extraction level elevation. From there, as described in Article 2, they used Dassault Systèmes’ PCBC mine planning software and other tools to automatically analyse selected parametric designs and, for each design, to:




generate new draw point distribution



create draw columns, based on block model and grade distribution data, to suit each tunnel spacing



run best height of draw (BHOD) simulations to estimate the economic mineable reserve, and



calculate the economic reserve and create a summary of average copper value, average economic value, and total tonnes extracted — with all physical and economic parameters mapped to, and captured in, the design of experiments (DOE).
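The best-height-of-draw step in the list above can be sketched as a cumulative-value scan up a single draw column: keep drawing while the running total of block values is still improving. The block values below are invented, and a real BHOD run in PCBC considers far more than this.

```python
# Simplified best-height-of-draw scan for one draw column.
def best_height_of_draw(block_values):
    """block_values: economic value per block, ordered bottom-up.
    Returns (height_in_blocks, cumulative_value) at the optimum."""
    best_h, best_v, running = 0, 0.0, 0.0
    for h, v in enumerate(block_values, start=1):
        running += v
        if running > best_v:
            best_h, best_v = h, running
    return best_h, best_v

# High grade low in the column, waste higher up: stop before the waste.
column = [12.0, 9.0, 4.0, -2.0, -6.0, 1.0]
print(best_height_of_draw(column))  # (3, 25.0) — draw 3 blocks high
```

Summing the optimal value over every draw column in a design is, in spirit, how each scenario's economic reserve figure in the DOE summary arises.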




After the company’s mine planner selected the favourite scenarios, based on economic results, it was time to use simulation to complete advanced geotechnical analyses.



Block Caving’s Geotechnical Challenges



Block caving requires a very large deposit, with sufficient height and footprint area, to be cost effective. It should also ideally include certain geotechnical characteristics, such as pre-existing rock fractures to speed fragmentation and enough rock mass strength to support extraction tunnels.



But those characteristics can be hard to assess, and there is always the risk that the deposit will simply be too solid to cave or that it will collapse unpredictably, making it difficult to extract the ore efficiently and also potentially hazardous for workers and equipment. In addition, mines do not want to design a block caving project that will lead to air gaps during cave propagation, which can cause dangerous air blasts.











Geomechanical Simulation



Geomechanical simulation can help mines understand factors such as:




whether there is enough ground support for a block-caving project



if a particular design will result in subsidence and a large crater on the surface, and



how large the economic recovery might be.




The copper company was particularly concerned that, while they appeared to have a good formation of high-grade material within their proposed mine design, there could be problems with vertical cave propagation. Specifically, they feared that the cave might not propagate fast enough to avoid air gaps, which would have a major effect on extraction level location and extraction strategy.



In order to better understand and calibrate cave-back stress and the potential for breakthrough, the company’s mine designer took the draw points identified in Dassault Systèmes’ PCBC mine planning software and sent them to our geotechnical simulation software, Abaqus.



Watch Video Here



From there, Abaqus ran a series of scenario simulations using a variety of inputs — including swell factor, friction angle, cohesion, strain value, and principal stresses — to reveal the answer to what could be a billion-dollar question: where the limit between broken and solid material should be located.






Results



For the copper company, the geotechnical simulations revealed that the extraction strategy they had been pursuing was wrong. It was too aggressive and would generate a large air gap very quickly.



As a result, the designer changed the strategy to pull material more slowly and allow enough time for the cave to mature and propagate — a change that would have a major impact on NPV.
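A deliberately simplified discounted-cash-flow comparison shows why a slower draw can still win on NPV: the fast schedule below carries a mid-life production loss standing in for an air-gap event. All cashflows and the discount rate are invented; the company's actual numbers came from the coupled simulations.

```python
# Standard NPV over yearly cashflows (year 1 first).
def npv(cashflows, rate):
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows, start=1))

fast = [120, 120, 10, 60, 60]   # aggressive draw, disrupted when the air gap collapses
slow = [80, 80, 80, 80, 80]     # steady draw, cave kept mature
print(round(npv(fast, 0.10), 1), round(npv(slow, 0.10), 1))  # slower schedule has the higher NPV
```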



This graphic shows the difference in NPV between two possible extraction strategies:



“Investigating Economic and Risk Metrics Using Design of Experiments in Fully Coupled Caving Geomechanics Simulation” (Caving 2022. S. Arndt, D. Villa, F. Khodayari, B. Ndlovu.)



What comes next



The final article in this series looks at how the real-life copper company evaluated thousands of production scenarios before arriving at a design that would create a smaller, but significantly more productive underground mine than its original mine design.







 ]]>
      </content:encoded>
      </item>
<item>
      <title>
      <![CDATA[ Integrating Parametric Design with Mine Planning ]]>
      </title>
      <link>https://blog--3ds--com.apsulis.fr/brands/geovia/integrating-parametric-design-with-mine-planning/</link>
      <guid>https://blog--3ds--com.apsulis.fr/guid/273168</guid>
      <pubDate>Mon, 18 Nov 2024 10:42:59 GMT</pubDate>
      <description>
      <![CDATA[ To make integration simple, Process Composer eliminates the need for a designer to manually transfer the parameters and results from the parametric design into a mine planning package. Instead, it works with PCBC to handle the output variables of the design as input variables for the mine plan, allowing for an automated workflow throughout the entire mine planning process and between multiple applications/software.
 ]]>
      </description>
      <content:encoded>
      <![CDATA[ 
Authored by Daniel Villa, GEOVIA Industry Process Consultant Expert.



In Article 1 of this series, an international copper mining company wanted to find out if it would be possible to increase productivity and reduce operating costs at a new block-caving project by changing their original design to accommodate new, larger load-haul-dump (LHD) machines.



Using a parametric design tool from Dassault Systèmes, they tested the effect of the larger loaders on their original mine design against four defined parameters:




tunnel spacing (production crosscut, draw bells, etc.)



undercut and extraction level elevation



offset from geology contact, east/west access tunnels, and



connection between levels through ore passes, ventilation raises, etc.




Owing to the efficiency of parametric design, these explorations took a matter of hours rather than days or weeks to accomplish, and the company was quickly ready to move to the next stage: figuring out whether these potential new designs would actually work the way they wanted them to.



Process Composer and PCBC mine planning tools



Using Dassault Systèmes’ PCBC mine planning tools combined with Process Composer, the copper company was able to easily:




integrate their parametric designs with mine planning, and



use automation and simulation of multiple scenarios to confirm whether or not changing extraction level design would in fact result in increased productivity.




To make integration simple, Process Composer eliminates the need for a designer to manually transfer the parameters and results from the parametric design into a mine planning package. Instead, it works with PCBC to handle the output variables of the design as input variables for the mine plan, allowing for an automated workflow throughout the entire mine planning process and between multiple applications/software.











In the copper company’s case, PCBC automatically analysed the various parametric designs (signified by the 3DX Parameters box above) and then handed off to other tools to:




generate new draw point distribution for each design



create draw columns, based on block model and grade distribution data, to suit each tunnel spacing



run best height of draw (BHOD) simulations to estimate the economic mineable reserve, and



calculate the tonnage, dilution, and grade of copper that could be extracted with the tunnels spaced at different intervals.
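The spacing trade-off in the last bullet can be made concrete with a minimal Python sketch. This is not PCBC output: the one-dimensional geometry and every number here are invented for illustration, but it shows why wider tunnel spacing leaves more tonnes locked in pillars between draw points:

```python
# Toy illustration only: PCBC's draw-column and BHOD calculations work on
# real block models; the 1-D geometry and numbers here are invented.

def draw_column_stats(tunnel_spacing_m, footprint_m=300.0,
                      column_height_m=200.0, density_t_m3=2.7,
                      drawpoint_width_m=15.0):
    """Tonnes reporting to draw points vs. locked in pillars, per metre
    of strike length, assuming columns form only above draw points."""
    n_tunnels = int(footprint_m // tunnel_spacing_m)
    drawn_width = n_tunnels * drawpoint_width_m
    pillar_width = footprint_m - drawn_width
    drawn_t = drawn_width * column_height_m * density_t_m3
    pillar_t = pillar_width * column_height_m * density_t_m3
    return {"tunnels": n_tunnels, "drawn_t": drawn_t, "pillar_t": pillar_t}

s30 = draw_column_stats(30.0)  # 30 m spacing
s36 = draw_column_stats(36.0)  # 36 m spacing: fewer tunnels, bigger pillars
```

With these assumed inputs, the 36m layout fits fewer tunnels across the same footprint, so more tonnage stays in pillars and less reports to draw points.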




Draw columns



In this image, the draw columns generated by PCBC match extraction level tunnels set at 30m:











Here, the draw columns generated by PCBC match extraction level tunnels set at 36m:











As you can see, however, the draw columns for tunnels set at 36m created much higher pillars between each draw point than tunnels set at 30m, which will affect the tonnes and grades the company will be able to extract.



Results



The copper mining company’s designer continued to use Dassault Systèmes’ PCBC mine planning tools to:




vary other physical parameters, such as draw-bell spacing, entry angles, elevation level, and size of pillars



run multiple scenarios to compare results, and



for every design generated, calculate the economic reserve and create a summary of average copper value, average economic value, and total tonnes extracted — with all physical and economic parameters mapped to, and captured in, the Design of Experiment (DOE) process.




From there, the designer employed the Process Composer results analytics program to first review all the scenarios:











And then provide 2D or 3D visualisations of each design, where each dot represents one run of the simulation, making the results very simple to analyse and compare:











After that, the designer added the favourites, based on average net smelter return (NSR), to a basket for more in-depth review.



What comes next



The next article in this series will look at how the copper miner employed simulation to complete advanced geotechnical analyses. The final piece will discuss how the company evaluated thousands of production scenarios before arriving at a design that would create a smaller but significantly more productive underground mine than its original mine design.







 ]]>
      </content:encoded>
      </item>
<item>
      <title>
      <![CDATA[ How to Improve Block Caving Design and Planning: The Secret is Automation and Simulation ]]>
      </title>
      <link>https://blog--3ds--com.apsulis.fr/brands/geovia/how-to-improve-block-caving-design-and-planning-the-secret-is-automation-and-simulation/</link>
      <guid>https://blog--3ds--com.apsulis.fr/guid/273136</guid>
      <pubDate>Mon, 18 Nov 2024 10:29:14 GMT</pubDate>
      <description>
      <![CDATA[ Using Dassault Systèmes’ parametric design tool, which enables mines to virtually create, update, and analyse a design within their own operating environment, the copper company’s designer used a number of different inputs to accommodate larger LHDs.
 ]]>
      </description>
      <content:encoded>
      <![CDATA[ 
Authored by Daniel Villa, GEOVIA Industry Process Consultant Expert.



1. The value of parametric mine design in block caving



An international copper mining company had one successful block caving project up and running and had developed a similar design for a second. Before committing to that design, however, the company wanted to find out if it would be possible to increase productivity and reduce operating costs at the new mine by changing the original design to accommodate new, larger load-haul-dump (LHD) machines.



The company knew, though, that this modification could have a number of consequences. For example, it could result in changes to:




tunnel sections, making pillars smaller and reducing stability, and



the distribution of tunnels at the extraction level (shown in orange in the image below), which would mean the entire design would have to be adjusted to maintain the connection between levels.












To help the company determine exactly what effect bigger LHDs would have on the existing design, their design team used a variety of integrated software options from Dassault Systèmes — starting with our parametric design tool — to analyse:




tunnel spacing (production crosscut, draw bells, etc.)



undercut and extraction level elevation



offset from geology contact, east/west access tunnels, and



the connection between levels through ore passes, ventilation raises, etc.




Why parametric design



While traditional 2D CAD-based design certainly works, it has its downsides, including the fact that, as a manual process, it takes a great deal of time because the designer must modify the entire shape of a design in response to a single change.



By using associativity to preserve the link between reference data — such as terrains and geology or resource models — and existing infrastructure models, parametric design removes or significantly reduces the need for a designer to edit the whole design in order to modify a single design parameter. The designer is able to update designs automatically (without losing previous designs) any time there is new input data because, while the inputs may have changed, the parameters of the design have not.
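The associativity described above can be sketched in a few lines of Python. This is a toy stand-in, not the Dassault Systèmes tool: real parametric CAD kernels maintain full dependency graphs, but the principle that derived geometry is recomputed from named parameters is the same:

```python
# Minimal sketch of parametric associativity (hypothetical names): derived
# geometry is a function of parameters, so changing one input regenerates
# the layout without manual redrawing.

class ParametricLayout:
    def __init__(self, tunnel_spacing, footprint_width):
        self.params = {"tunnel_spacing": tunnel_spacing,
                       "footprint_width": footprint_width}

    def set_param(self, name, value):
        self.params[name] = value  # one change drives the whole design

    def tunnel_positions(self):
        """Derived output: tunnel centrelines recomputed from parameters."""
        s = self.params["tunnel_spacing"]
        w = self.params["footprint_width"]
        return [i * s for i in range(int(w // s) + 1)]

layout = ParametricLayout(tunnel_spacing=30, footprint_width=300)
before = layout.tunnel_positions()      # centrelines at 0, 30, ..., 300
layout.set_param("tunnel_spacing", 36)  # single parameter change
after = layout.tunnel_positions()       # centrelines at 0, 36, ..., 288
```

Nothing in `before` had to be edited by hand; the 36m layout simply fell out of the same parameterised definition.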



In addition, parametric design allows designers to:




create and compare multiple 3D models, return to the original model and try again, or continue forward using a whole new set of parameters



run automated simulation loops to evaluate the impact of a change, and



test different hypotheses through scenario analysis.




Updating the original design



Using Dassault Systèmes’ parametric design tool, which enables mines to virtually create, update, and analyse a design within their own operating environment, the copper company’s designer used a number of different inputs to accommodate larger LHDs.



For example, the designer started by spacing the extraction level tunnels at 30m. The parametric design tool then automatically updated the undercut level to meet that criterion:











The red lines are the tunnels at the undercut level; the yellow lines are the tunnels at the extraction level, where the loaders run.



For comparison, the designer then widened the extraction level tunnel spacing to 36m, and the tool immediately updated both the extraction and undercut level tunnels, while maintaining the same distribution of tunnels — in a matter of seconds:











After altering the tunnel spacing a few more times, the designer determined the 30m tunnel spacing was optimal and moved on to experimenting with tunnel heights and other parameters, including the offset between the tunnels and a geological structure at one end of the mine site.



The image below shows the tunnels positioned at 25m from the geological structure, shown in gray at the top:











After taking into account the most current geotechnical data, however, which indicated that this offset might not be enough for safety, the designer doubled the distance to 50m, and the parametric design tool again updated the whole design automatically:











The designer then went on to explore other parameters, including draw-bell spacing and entry angles, ore pass locations, load elevations, etc.



What comes next



The next article in this series will show how this same mine company used the Dassault Systèmes PCBC mine planning tool kit to integrate mine design with mine planning and to use automation and simulation to test those designs under real-life conditions.



The final two articles will look at how the company employed simulation to complete advanced geotechnical analyses and to evaluate thousands of production scenarios before arriving at a design for its new block caving project that would create a smaller but significantly more productive underground mine than its original mine design.







 ]]>
      </content:encoded>
      </item>
<item>
      <title>
      <![CDATA[ A New, More Strategic, Way to Mine ]]>
      </title>
      <link>https://blog--3ds--com.apsulis.fr/brands/geovia/a-new-more-strategic-way-to-mine/</link>
      <guid>https://blog--3ds--com.apsulis.fr/guid/272787</guid>
      <pubDate>Thu, 14 Nov 2024 09:32:35 GMT</pubDate>
      <description>
<![CDATA[ Mine planning concentrates on long-range production planning aimed at maximising the value derived from exploiting an ore deposit. However, by its very nature, because it is long-term, a mine plan can be affected by a variety of internal and external forces including, for example, increased knowledge of the orebody, unexpected staffing issues, technical advancements, and changes in legislation, economy, and market.
 ]]>
      </description>
      <content:encoded>
      <![CDATA[ 
Mining, like many industries, can be slow to change. We often stick to traditional processes far longer than we should because we are comfortable with them and don’t want to take the risk of trying out new ones. But that means we may also miss significant opportunities both to improve profits and increase sustainability.



A common refrain from mine planners is that they do not have enough time to look at all the possible options for a solution space, which leaves them with sleepless nights worrying about such questions as: How much value was left in the last untested cut-off grade or mining capacity limit? Which direction and sequence would have created the best schedule? Have we followed the pit optimisation angles closely enough?



The fact is, however, that this situation can be solved — and these important questions can be answered — if we adopt a different approach to strategic mine planning.



Strategic mine planning



Mine planning concentrates on long-range production planning aimed at maximising the value derived from exploiting an ore deposit. However, by its very nature, because it is long-term, a mine plan can be affected by a variety of internal and external forces including, for example, increased knowledge of the orebody, unexpected staffing issues, technical advancements, and changes in legislation, economy, and market.



Strategic mine planning attempts to de-risk a mine plan, making it flexible enough to adapt to changes as and when they arise.



Traditional approach



The traditional approach to developing a mine plan is to assess a mine project based on its net present value (NPV). The NPV, calculated by progressively discounting future cash flows at a rate that reflects both the profit the project must earn and its risks, becomes the primary KPI for the mine plan and drives decisions about where to start the extraction and how to orient the sequence.
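As a rough illustration of the discounting behind NPV (toy cash flows and an assumed 10% discount rate, not figures from any real project):

```python
# Toy cash flows in $M; a 10% discount rate is assumed for the example.

def npv(cash_flows, discount_rate):
    """Net present value of year-end cash flows at a fixed discount rate."""
    return sum(cf / (1 + discount_rate) ** (t + 1)
               for t, cf in enumerate(cash_flows))

front_loaded = [60, 50, 40, 30, 20]   # revenue brought forward
flat         = [40, 40, 40, 40, 40]   # same 200 $M undiscounted total

# Identical totals, but discounting rewards the front-loaded schedule,
# which is why sequencing decisions move NPV even when reserves don't change.
```

This is the mechanism behind every "where to start and how to sequence" decision in the rest of this article: earlier revenue is discounted less.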



For open pit mines, mine planners traditionally define reserves using the Lerchs-Grossmann (LG) algorithm, which identifies an economic envelope (pit shell), constrained to maximum slope angles, that will maximise the total undiscounted cash flow. With that final pit identified, the planner builds a sequence to reach the final pit often by creating nested pit shells using the same algorithm but constraining the volume of the output envelopes or adjusting the block model valuation using revenue factors (RFs). To select a subset of the nested pits to serve as pushback expansions toward the final pit, the mine planner then calculates the preliminary schedules.
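The nesting behaviour of revenue-factor shells can be illustrated with a deliberately simplified one-dimensional model. A real LG run solves a 3-D precedence graph under slope constraints; this sketch (invented block values) only shows why higher RFs produce larger, nested envelopes:

```python
# Toy 1-D nested-shell illustration: each block's value is RF * revenue - cost,
# a "pit" is the first d blocks from surface, and the cash-flow-maximising
# depth grows monotonically with the revenue factor (RF).

revenue = [10, 4, 12, 3, 8, 2]   # per-block metal revenue, surface downward
cost    = [5, 5, 5, 5, 5, 5]     # per-block mining cost

def best_depth(rf):
    """Depth (number of blocks) maximising undiscounted cash flow at this RF."""
    best_d, best_val, cum = 0, 0.0, 0.0
    for d, (r, c) in enumerate(zip(revenue, cost), start=1):
        cum += rf * r - c
        if cum > best_val:
            best_d, best_val = d, cum
    return best_d

shells = {rf: best_depth(rf) for rf in (0.6, 0.8, 1.0, 1.2)}
```

Each lower RF simulates a lower price level, yielding a smaller envelope inside the next; the planner's nested pit shells are the 3-D analogue of these depths.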



Issue with this approach



The issue with this traditional approach is that most of the time, the nested shells available for the planner to select as pushbacks are not operationally feasible, and that may in turn require:




mining multiple satellite pits in earlier periods of the life of the mine



having a large starter pit, even for small revenue factor increments



following a concentric sequence, which requires multiple mining fronts, and/or



accepting awkward pushback shapes and sizes, which may be difficult to implement.




Often, the planner will try to work around these issues by building some feasible pushback designs loosely based on a set of nested pit shells, splitting and merging different envelopes. However, this often locks in the decision to use a pushback sequence based on the RF-limited pit shells instead of exploring other possible sequences towards the same final envelope. As a side effect, because the traditional approach maximises undiscounted cash flow for simulated price levels through different RFs, there is no guarantee that the resulting sequence will maximise NPV, and it may be out of alignment with other feasibility-focused KPIs.



A more flexible approach



Using process-automation tools with mine planning software allows mine planners to better appraise the optimisation solution space, delivering a workflow such as this:




1. Generate a ‘value map’ based on a modified pit optimisation algorithm that allows the planner to easily:





compare directional approaches while taking into account other vital components of a mine plan, such as spatial constraints, sinking rate, and other feasibility KPIs as well as NPV, and



identify the best starting region and corresponding directions, based on an assessment of preliminary strategic schedules for each combination.




Figure 1. Optimised pit phases







2. Run thousands of possible scenarios based on mining rate and production capacity, their corresponding CAPEX and OPEX (sizing both the mine and the processing plant, and their corresponding costs, appropriately), and cut-off grade, with each scenario producing its own mine plan and production schedule.



3. Optimise the schedule to maximise NPV by identifying what material to mine from each pushback and when, since the “what and when” will affect the mine’s order of revenues and costs (aka cashflow).



For example, the traditional approach has been to optimise the material sent to the processing plant based purely on mining and processing capacity. This can lead to low-grade material taking up vital plant capacity in the early periods, reducing NPV among other KPIs.



A better approach would be to stockpile lower-grade ore in the early years of production in order to prioritise higher-grade processing early on, and then process the remainder of the viable ore later, increasing NPV over the life of the mine. Better still would be to optimise not only the processing capacity, cut-off grades, and stockpile usage, but to do so at the same time as choosing the sequence and the pit shells. This would free the optimisation to explore a wider solution space rather than locking it into decisions made in the previous step.
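A toy model of the stockpiling argument (invented grades; no rehandle cost or blending is modelled): with a fixed plant capacity, feeding the highest grades first and parking low-grade ore defers low-value tonnes to later, more heavily discounted years.

```python
# Six invented 1 t parcels; the plant treats two parcels per year and
# revenue is assumed to be 10 x grade. No rehandle cost is modelled.

DISCOUNT = 0.10

def npv_of_feed(grades_in_feed_order):
    """NPV when parcels are fed to the plant two per year in this order."""
    total = 0.0
    for i, grade in enumerate(grades_in_feed_order):
        year = i // 2  # plant capacity: 2 parcels per year
        total += grade * 10.0 / (1 + DISCOUNT) ** (year + 1)
    return total

mined_order = [3.0, 1.0, 2.5, 0.8, 1.8, 1.2]  # grades in mining sequence
fifo_npv = npv_of_feed(mined_order)           # feed plant in mined order
stockpile_npv = npv_of_feed(sorted(mined_order, reverse=True))  # best first
```

The undiscounted revenue is identical either way; only the timing changes, and that timing difference is the entire NPV gain from stockpiling.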



Figure 2. Strategic mine planning vision.







Strategic mine design



It is important to remember that optimisation and scheduling is only one side of the coin that is mine planning and that it takes a design to make a schedule actionable. In a process where we are creating large numbers of scenarios for optimisations and schedules, it is critical to establish a living design model with an intelligent workflow that updates as objects and inputs change.



Traditional CAD-based mine design works well but because it requires a designer to modify the entire shape of a design in response to a change, it is often slow, manual work that is prone to mistakes. This slowness can mean that a designer is able to produce just one or maybe two design options by deadline, with no time left for engineers to evaluate the integrity of the design.



Adding an automated parametric capability to the traditional design process not only ensures faster execution, it does so with improved accuracy and flexibility over traditional mine design.



Parametric design fundamentals



Parametric design does not so much produce a single solution as generate a family of possible outcomes through dynamic automation.



Parametric modeling can use either a:




propagation-based system, where algorithms produce final shapes that are not predetermined by initial parametric inputs, or a



constraint system, where final constraints are set and algorithms define fundamentals (structures, material use, etc.) that satisfy these constraints.




Propagation-based systems often include ‘form-finding’ processes that optimise specific design goals against a set of design constraints, so that the final form of the designed object is ‘found’ based on these constraints.



Both types of parametric modeling have been used for years in other industries, such as civil construction, aviation, and manufacturing, as a replacement for traditional 3D CAD-based design.



Creating a living model



The parametric model-based approach incorporates traditional CAD functions but differs by adding links between objects and parameters.



This associativity preserves the connection between reference data — such as terrains and geology or resource models — and existing infrastructure models. This in turn allows the mine designer to update designs automatically every time there is new input data because, while the input data may have changed, the parameters of the design have not. The designer can also create templates from a series of functions and parameters to speed up repetitive design tasks, and deploy them manually or automatically through scripting.



The result is a “living” model where design changes made in a localised area will update the global mine design, and designs are ready for review days or even weeks faster than traditional practice allows.



Running limitless simulations



Mining projects are complicated, expensive, and extremely risky ventures. Being able to simulate everything from the mine design to the material movement in advance is critical to de-risking a project.



By automating the manual and iterative work done by the mine designer, parametric simulation enables the designer to compare the original design configuration with a larger spectrum of data. It works like this: regression models are first trained on simulation data and then progressively calibrated on measured data during a set monitoring period in order to (1) evaluate the robustness of design-phase performance and detect potentially critical assumptions, and (2) maintain continuity with operation-phase performance through feedback from measured data.
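The train-then-calibrate loop might look like this in outline. A plain least-squares line stands in for the real response-surface surrogate, and all data are invented:

```python
# Sketch of surrogate calibration: fit a regression model on simulation
# outputs, then re-fit on measured data from the monitoring period and
# compare the two fits to detect a systematic bias in design assumptions.

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Phase 1: train on simulated (input, performance) pairs.
sim_x, sim_y = [1, 2, 3, 4], [2.0, 4.1, 5.9, 8.0]
model = fit_line(sim_x, sim_y)

# Phase 2: calibrate on measured monitoring data, which here carries a
# systematic offset relative to the simulations.
meas_x, meas_y = [1, 2, 3, 4], [2.5, 4.6, 6.4, 8.5]
calibrated = fit_line(meas_x, meas_y)
offset = calibrated[1] - model[1]  # detected bias in design-phase assumptions
```

A nonzero `offset` is exactly the kind of "potentially critical assumption" the monitoring period is meant to surface before it erodes operation-phase performance.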



Applying simulations in real life



Parametric simulations can be used to design mining phases that consider unexpected variations and uncertainties, such as the metal content available in a mineral deposit and shifting commodity prices.



In the illustration below, we used a Design of Experiments (DoE) to perform a wide range of input modifications to a pit optimisation run. This allowed us to calculate tens of thousands of scenarios and explore the entire solution space, with the output being a dynamic set of pit shells linked to and associated with a set of pit design parameters. This associativity, coupled with parametric design, created the design shown below.
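The sweep mechanics of such a DoE can be sketched as a full-factorial grid. A stub objective stands in for the actual pit-optimisation call, and all input levels and economics are invented:

```python
# Full-factorial DoE sketch: enumerate every combination of input levels,
# evaluate each scenario, and pick the best. In the real workflow each
# evaluation is a pit-optimisation run producing a linked pit shell.

from itertools import product

def scenario_value(price, mining_cost, cutoff):
    """Toy objective standing in for a pit-optimisation result."""
    tonnes = 100.0 * (1 - cutoff)   # higher cut-off means fewer tonnes...
    grade = 1.0 + cutoff            # ...but a richer feed
    return tonnes * (grade * price - mining_cost)

grid = list(product([80, 100, 120],      # metal price levels
                    [30, 40],            # mining cost levels
                    [0.0, 0.2, 0.4]))    # cut-off grade levels
results = {combo: scenario_value(*combo) for combo in grid}
best = max(results, key=results.get)
```

Scaling the same pattern from 18 combinations to tens of thousands is what the automation layer makes practical: the sweep itself is trivial, the per-scenario evaluation is not.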



Figure 3. Optimised pit and haul road design.







The design now maintains a constant link with the optimisation results. As alternative scenarios are selected, new designs are automatically created and stored with their own revision and life cycle. We can also choose to link and associate them with other restriction criteria not made available to optimisation, such as pit crusher locations that require their own areas for infrastructure, flat areas in the ramp for regulatory purposes, or sump locations for pumping requirements. And we can assign a template to each of these criteria that is associated with the design and will be used to automatically update it.



Finally, each design can be used again within the life-of-mine scheduling, closing the planning loop and confirming the assumptions taken previously in the optimisation step.



Figure 4. Whittle and Process Composer Design of Experiments.







Conclusion



Strategic mine planning and parametric design are critical innovations at a time when mining companies are looking to reduce time to market and to address marginally economic deposits alongside social and ESG challenges.



If we can reduce our reliance on traditional mine planning tools and embrace new and innovative technologies, we will find the opportunities we need to move forward into a secure and responsible future.







 ]]>
      </content:encoded>
      </item>
    </channel>
   </rss>