Hey guys, haven’t been here in ages, but anywho, I’ve a huge problem.
What I’m trying to do is test the “dice” 20 times to check its randomness. The program is meant to record the most amount of times the numer appeared, the least amount of times the number occured, and the average.
I’ve managed to get the program to calculate the amount of times a number occured, but I’m having trouble with the average, and the most amount of times it occured and the least amount of times it occured.
public class TestRandomness
{
static int rolledNumber()
{
int rolledNumber = (int) (1 + (Math.random() * 6));
return rolledNumber;
}
static double getAverage()
{
double takenAverage;
for(int r = 1; r <= 20; r++)
{
takenAverage = ((counts[r] + 1) / 6);
}
return takenAverage;
}
public static void main(String [] args)
{
System.out.println("How many times do you want to throw the dice: ");
int UserAmount = Console.readInt();
int [] counts = new int [7];
int [] averages = new int [7];
for(int y = 0; y <= 20; y++)
{
for(int i = 1; i <= UserAmount; i++)
{
int returnedNumber = rolledNumber();
counts[returnedNumber] = counts[returnedNumber] + 1;
}
}
for(int x = 1; x <= 6; x++)
{
System.out.println(x + ": " + counts[x]);
}
}
}
I would try and figure it out myself, I truly would, but I’ve run out of time, and desperately need it done!
Thanks for any help guys!
Hauk