Hi All,
I’m trying to fill up a grid of cells with random objects, but certain objects need to have a higher chance of generating than others.
So, imagine you have a 5x5 grid of cells: I might say that I want 50% of the cells to be empty, and the other 50% to (potentially) contain something.
I made a little diagram showing roughly how the distribution of values works:
Here’s where I get confused.
When I write the code to determine whether or not a cell is empty, I can do something like this:
int chanceForSomething = 50;
Random rnd = new Random();
if (rnd.nextInt(100) < chanceForSomething)
{
// cell isn't empty, generate something here!
}
But what confuses me is how I then determine what to generate, based on each object’s individual chance of generating. Looking at the percentage graph above, I can see how the lower 50% could be broken up into its own set of percentages (20% becomes a 40% chance, 5% becomes 10%, etc.). However, if I were to do a calculation like this:
// chances for each object size (these add up to 100)
int chanceForHuge = 40;
int chanceForLarge = 30;
int chanceForMedium = 10;
int chanceForSmall = 10;
int chanceForTiny = 10;
Random rnd = new Random();
int rndVal = rnd.nextInt(100); // how does this distribute against the chance probabilities?
How do I work out which of those chance probabilities ‘rndVal’ falls under? Am I thinking about this all wrong?
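The closest I’ve got to a guess is keeping the chances in an array and subtracting them from ‘rndVal’ one at a time until it drops below zero. Something like this (just a rough sketch with made-up names, so I’m not sure it’s the right way to go about it):
// same chances as above, in order: huge, large, medium, small, tiny
// they add up to 100, so every rndVal (0-99) should land in exactly one slice
int[] chances = { 40, 30, 10, 10, 10 };
Random rnd = new Random();
int rndVal = rnd.nextInt(100);
int chosenIndex = -1; // ends up as 0 = huge, 1 = large, ... 4 = tiny
for (int i = 0; i < chances.length; i++)
{
    rndVal -= chances[i];
    if (rndVal < 0)
    {
        chosenIndex = i; // rndVal fell inside this object's slice
        break;
    }
}
// generate the object for 'chosenIndex' here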
I want larger objects to spawn more frequently than smaller ones, so they should have a higher chance of being created. But I also need room to add more objects to this list of probabilities in future, without having to resort to ‘baking’ the percentage ranges into the code with checks like ‘if (chance >= 5 && chance < 10) …’ (roughly the sort of thing sketched below). That seems like an unnecessarily ‘dirty’ way of handling this.
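Just to show what I mean by ‘baking’, this is roughly the hard-coded version I’d like to avoid, where adding a new object type means reworking every threshold by hand (the ranges here are just the running totals of the chances above):
int rndVal = rnd.nextInt(100);
if (rndVal < 40)
{
    // huge (0-39)
}
else if (rndVal < 70)
{
    // large (40-69)
}
else if (rndVal < 80)
{
    // medium (70-79)
}
else if (rndVal < 90)
{
    // small (80-89)
}
else
{
    // tiny (90-99)
}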
Any helpful insight into this would be greatly appreciated. Thanks!