“Anonymous fan” writes:
Key question: How much gold is in the 100 billion tons (or whatever) of rock in the valley?
Best answer: Dig up and process all the rock. Then you know for sure!
2nd best answer: Pick 1,000 random spots in the valley. (Even better, don’t make them purely random. Ensure that they cover the valley thoroughly.) Bore down to the proposed max depth of the mine. Process those samples. You might get “lucky”, in that, by chance, you pick some gold-heavy spots to dig. The major problem/fraud opportunity with this approach is that the spots must be random. If you just pick spots that you think have gold, even though they are not representative of the whole valley, then your estimate will be too high.
3rd best answer: Sample from the sample. You don’t want to wait for all 1,000 samples to come in, nor do you want to process all that rock. So, you sample from the sample by randomly taking a sample from some of the 1,000 cores. Of course, you need to worry about biases. Don’t take only from the top of each sample and so on. But, if everything is random, this should also give you a good, but less precise, estimate of the 1,000 cores you plan on drilling and of the whole valley.
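The three estimation strategies in the letter can be sketched as a quick simulation. This is a toy model with made-up numbers (the valley, the grade distribution, and the sample sizes are all hypothetical), but it shows the key statistical point: a random subsample of a random sample stays unbiased, just noisier, while cherry-picking rich spots biases the estimate upward.

```python
import random
import statistics

random.seed(1)

# Hypothetical valley: 100,000 candidate spots whose true gold grade (g/t)
# is mostly near zero with occasional rich pockets (lognormal, made-up params).
valley = [random.lognormvariate(-2.0, 1.5) for _ in range(100_000)]
true_mean = statistics.mean(valley)

# 2nd best answer: 1,000 random cores -- an unbiased estimate of the valley.
cores = random.sample(valley, 1_000)
core_mean = statistics.mean(cores)

# 3rd best answer: sample from the sample -- still unbiased, less precise.
subsample = random.sample(cores, 50)
sub_mean = statistics.mean(subsample)

# Cherry-picking: assaying only the richest spots overstates the valley.
cherry_mean = statistics.mean(sorted(valley, reverse=True)[:1_000])

print(f"true valley mean   {true_mean:.3f} g/t")
print(f"1,000 random cores {core_mean:.3f} g/t")
print(f"50-core subsample  {sub_mean:.3f} g/t")
print(f"cherry-picked      {cherry_mean:.3f} g/t")
```

Running this, the random-core and subsample estimates cluster around the true mean (the subsample just wanders more), while the cherry-picked figure is several times too high.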
All this is background for my attempt to square the circle of what Strathcona and Pretium said, along with an assumption of no and/or cheap fraud. That is, no salting of gold.
Thesis: Strathcona’s results (3rd best answer) are correct. There was no gold in what they sampled, a small portion of the overall sample. Pretium’s results (2nd best answer) are correct. There is gold in the full sample. But Pretium cheated by not truly taking a random sample like you are supposed to. That is, maybe after Strathcona left, they changed the places they were going to sample. It was no longer random. Instead, they found some clumped gold deposits, largely unrepresentative of the entire valley, and purposely dug them up. This isn’t Bre-X type fraud. There is gold in that sample. But that sample is totally unrepresentative of the entire valley, so PVG is still worthless.
Do you think that is a plausible scenario?
They actually don’t want to sample the rock randomly. If the deposit is mined, they want to mine the mineralized ore and as little waste rock as possible. Ideally, they want to simulate actual mining. A commercial mine will mine most of the mineralized ore and inevitably some waste rock (mining dilution will be unavoidable in Pretium’s case). So, they want to be sampling the mineralized ore and a little bit of the waste rock.
(*Some of the waste rock needs to be sampled because much of it will end up in the processing plant and affect how well the processing plant can separate the precious metals from all of the waste. Waste rock has minor relevance to the overall economics of a mine. This is not something I’m really worried about.)
There are different issues here:
- (A) They want to figure out where the mineralized ore is.
- (B) They want to know the grade of the mineralized ore.
- (C) To figure out the metallurgy for the processing plant, they will need a lot of material that is no grade, low grade, and high grade.
In a commercial mine, (A) will be determined through past and/or on-going exploration drilling. Based on all of the past drillhole data, they will have figured out the correlation between gold and the type of rock that hosts the gold (and silver and other metals that may be worth recovering). This correlation won’t be perfect but hopefully the correlation is very strong. From the exploration drilling, they should be able to figure out where the mineralized ore is.
One of the concerns that Strathcona had is that there might not be continuity in the veins. The resource model makes predictions about where the mineralized ore should be. When they do the bulk sample, hopefully the mineralized ore is where it should be. If the resource model isn’t very good at predicting where the mineralized ore is, then the company will need to drill more holes spaced closer together to figure out where it is. This would raise the costs of a commercial mine a little bit. A second consequence is that the resource model may be overstating or understating the tonnage of the deposit.
On to (B). There are some gold deposits that are very heterogeneous or have a high “nugget effect”. They call it a nugget effect because the gold tends to clump up into nuggets. If a drillhole hits a nugget, the grades will be extremely high. If it doesn’t, the grades will be extremely low. There can even be a nugget effect between the two halves/splits of a drillcore. Because of this, it is very difficult to estimate the grade of the deposit accurately based on the drillcore data alone. A bulk sample will give a much more accurate picture of the grade than the drillcore data.
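The nugget effect can also be sketched numerically. In this toy model (all numbers are invented: nugget frequency, grades, and sample sizes are assumptions, not Pretium data), almost all of the gold sits in rare nugget-bearing parcels, so small drillcore assays swing wildly while a bulk sample averages the nuggets out.

```python
import random
import statistics

random.seed(7)

# Hypothetical high-nugget deposit: 1,000,000 small parcels of rock.
# 1% of parcels carry a 50 g/t nugget; the rest run a trace 0.05 g/t.
deposit = [50.0 if random.random() < 0.01 else 0.05
           for _ in range(1_000_000)]
true_grade = statistics.mean(deposit)

# Drillcore assays: each core samples only ~20 parcels, so most cores
# miss every nugget and a lucky few hit one -- estimates are all over.
core_assays = [statistics.mean(random.sample(deposit, 20))
               for _ in range(50)]

# Bulk sample: tens of thousands of parcels averages out the nuggets.
bulk_grade = statistics.mean(random.sample(deposit, 50_000))

print(f"true grade           {true_grade:.2f} g/t")
print(f"core assays range    {min(core_assays):.2f} to {max(core_assays):.2f} g/t")
print(f"bulk sample estimate {bulk_grade:.2f} g/t")
```

The cores that miss every nugget assay near zero and the cores that hit one assay many times the true grade, which is exactly why a bulk sample is more trustworthy than drillcore data for this kind of deposit.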
On to (C). See my post on mineral processing for an explanation as to why they need a lot of material from a bulk sample. The reason why they need to test low grade ore is because the processing plant should be designed to do well for both low and high grade ore. Later on, a future mine/mill would process low grade ore if that ore is economic. I believe the Pretium bulk sample was designed to grab material over a range of grades. This is normal.
To answer your question, the bulk sample is not random and is not supposed to be random.
That is, maybe after Strathcona left, they changed the places they were going to sample.
This seems extremely unlikely to me. For starters, the mining crew would know. It would be extremely weird if there were material that wasn’t being run through the sample tower, if they were mining where they weren’t supposed to mine, or if they were mining past their permitted 10,000t. On top of that, they would need to mine millions of dollars of gold, so there has to be a lot of gold in the deposit to begin with. The idea seems contrived and very far-fetched to me.