In Java, any number written without a decimal point is assumed to be an
integer (int) value.

As soon as there is a decimal point written, the number is a floating
point value. In Java, such a literal has type double; a float literal
needs an explicit f suffix (for example 2.5f).
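
A minimal sketch of these literal types (the class and variable names here are just for illustration):

```java
public class Literals {
    public static void main(String[] args) {
        int whole = 42;         // no decimal point: an int literal
        double measured = 42.0; // decimal point: a double literal
        float approx = 42.0f;   // a float literal needs the 'f' suffix
        System.out.println(whole + " " + measured + " " + approx);
    }
}
```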

Usually a **double** value is better than an **int** value, because:

- double values can be larger, up to roughly 10^308

- double values can have decimals, and will allow correct division operations

- **int** has values between -2,000,000,000 and +2,000,000,000 (approximately);
larger values cause an **overflow**

- **int** variables cannot store fractions (decimals)
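
A short sketch of these limits, using the real constants Integer.MAX_VALUE and Double.MAX_VALUE (the class name is illustrative, and the comments show the expected output):

```java
public class IntLimits {
    public static void main(String[] args) {
        System.out.println(Integer.MAX_VALUE);     // 2147483647 (about +2,000,000,000)
        System.out.println(Integer.MAX_VALUE + 1); // -2147483648: the addition overflows
        System.out.println(7 / 2);                 // 3, because int division drops the fraction
        System.out.println(7 / 2.0);               // 3.5, because one operand is a double
        System.out.println(Double.MAX_VALUE);      // about 1.8 x 10^308
    }
}
```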

- Calculate the average of the following ages (a worked sketch follows this list):

25 students are 15 years old

45 students are 16 years old

30 students are 17 years old

- Add up the following:

1+3

1+3+5

1+3+5+7

...

1+3+5+7+....+97+99

Is there a pattern? (A sketch that computes these sums follows this list.)

- Explain why adding up int values always gives correct answers, but
dividing int values can give incorrect answers. (See the last sketch
after this list.)
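
One possible sketch for the average-age exercise (the class name AverageAge is just an illustration). Note how int division loses the fractional part of the answer:

```java
public class AverageAge {
    public static void main(String[] args) {
        int totalYears = 25 * 15 + 45 * 16 + 30 * 17; // 375 + 720 + 510 = 1605
        int students = 25 + 45 + 30;                  // 100 students in total
        System.out.println(totalYears / students);          // 16: int division loses the .05
        System.out.println((double) totalYears / students); // 16.05: the correct average
    }
}
```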
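For the odd-number sums, a loop such as the following (names are illustrative) prints each running total. Every printed sum equals count * count, suggesting the pattern: the sum of the first n odd numbers is n squared.

```java
public class OddSums {
    public static void main(String[] args) {
        int sum = 0;
        int count = 0;
        for (int odd = 1; odd <= 99; odd += 2) {
            sum += odd;  // int addition is exact, so these totals are always correct
            count++;
            System.out.println("first " + count + " odd numbers: " + sum);
        }
        // the last line prints 2500, which is 50 * 50
    }
}
```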
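For the last question, a sketch of the key difference: adding int values can never produce a fraction, so nothing is lost, while int division discards the fractional part of the true answer:

```java
public class IntDivision {
    public static void main(String[] args) {
        System.out.println(10 + 3);   // 13: exact, since adding ints never creates a fraction
        System.out.println(10 / 3);   // 3: the true quotient 3.333... is truncated
        System.out.println(10 / 3.0); // 3.3333333333333335: a double keeps the fraction
    }
}
```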