# Why * 1.00? Around 5 mins in.

Could you elaborate on why you multiply by 1.00 at the first count? I can't really get that. Thanks!

3 answers (0 marked as helpful)

1.0, 1.00, and 1.000 are all the same thing.

She is using a decimal literal to force an implicit cast from integer division to decimal division.

This is because `count()` returns an integer value, and dividing one integer by another truncates the result. One way to make the division non-integer is to multiply by `1.0`.

I hope I've answered your question.
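To make the truncation concrete, here is a small runnable sketch using Python's built-in SQLite driver. The table and column names (`tasks`, `done`) are made up for illustration; the point is the difference between `SUM(done) / COUNT(*)` and `SUM(done) * 1.0 / COUNT(*)`.

```python
import sqlite3

# Hypothetical table: 5 tasks, 3 of them done.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tasks (done INTEGER)")
conn.executemany("INSERT INTO tasks VALUES (?)",
                 [(1,), (1,), (1,), (0,), (0,)])

# COUNT(*) and SUM() return integers, so 3 / 5 truncates to 0.
truncated = conn.execute(
    "SELECT SUM(done) / COUNT(*) FROM tasks").fetchone()[0]

# Multiplying by 1.0 promotes the expression to a real number,
# so the division keeps its fractional part: 3.0 / 5 = 0.6.
ratio = conn.execute(
    "SELECT SUM(done) * 1.0 / COUNT(*) FROM tasks").fetchone()[0]

print(truncated, ratio)  # 0 0.6
```

Exact casting rules vary between database engines, but the integer-division pitfall and the `* 1.0` workaround are common to most of them.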

To convert, let's say, 12 into 12.0, she multiplied it by 1.0. The success ratio should be expressed as a percentage (which can have a decimal part), so she did this to get that.

Thank you both for your answers!