Last answered:

09 Nov 2022

Posted on:

03 Nov 2022

1

Why * 1.00? Around 5 mins in.

Could you elaborate on why you multiply by 1.00 at the first COUNT? I can't really get that. Thanks!

3 answers ( 0 marked as helpful)
Posted on:

07 Nov 2022

4

1.0, 1.00, and 1.000 are all the same value.
She is multiplying by a decimal to force an implicit cast from integer to decimal before the division.

This is because COUNT() returns an integer, and dividing one integer by another truncates the fractional part. One way to make the result non-integer is to multiply by 1.0 first.
I hope I've answered your question.
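As a quick illustration of this (a minimal sketch using SQLite through Python, not the course's exact query; the `applications` table and `succeeded` column are made-up names):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical table: 2 successful applications out of 4.
cur.execute("CREATE TABLE applications (succeeded INTEGER)")
cur.executemany("INSERT INTO applications VALUES (?)", [(1,), (1,), (0,), (0,)])

# COUNT() returns an integer, so integer / integer truncates to 0.
ratio = cur.execute(
    "SELECT SUM(succeeded) / COUNT(*) FROM applications"
).fetchone()[0]
print(ratio)  # 0

# Multiplying by 1.0 forces a cast to decimal, so the ratio survives.
ratio = cur.execute(
    "SELECT SUM(succeeded) * 1.0 / COUNT(*) FROM applications"
).fetchone()[0]
print(ratio)  # 0.5

conn.close()
```

The same trick works in most SQL dialects, though some (e.g. MySQL) already return a decimal from `/` and only truncate with the `DIV` operator.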

Posted on:

09 Nov 2022

1

To convert, say, 12 into 12.0, she multiplied it by 1.0. The success ratio should be a percentage, which can have decimal places, so she did this to keep the decimal part.

Posted on:

09 Nov 2022

1

Thank you both for your answers!
