## Question

Given an integer `n`, return an array `ans` of length `n + 1` such that for each `i` (`0 <= i <= n`), `ans[i]` is the number of `1`'s in the binary representation of `i`.

## Algorithm

The core of this problem is counting the `1` bits of each number efficiently.

The expression `n & (n - 1)` zeroes out the least significant set bit of `n`. Subtracting 1 flips the lowest set bit to `0` and turns every bit below it into `1`, so ANDing with the original clears exactly that bit and leaves everything above it unchanged. Repeating this until `n` becomes `0` counts the set bits in as many iterations as there are `1` bits.
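A quick illustration of the trick (the class name `BitTrickDemo` is just for this sketch): for `n = 12` (binary `1100`), `n - 1 = 11` (binary `1011`), and their AND is `1000`, i.e. the lowest set bit has been cleared.

```java
public class BitTrickDemo {
    public static void main(String[] args) {
        int n = 12;                                               // binary 1100
        System.out.println(Integer.toBinaryString(n));            // prints "1100"
        System.out.println(Integer.toBinaryString(n & (n - 1)));  // prints "1000"
    }
}
```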

## Code

```java
class Solution {
    public int[] countBits(int n) {
        int[] ans = new int[n + 1];
        for (int i = 0; i <= n; i++) {
            ans[i] = helper(i);
        }
        return ans;
    }

    private int helper(int n) {
        int count = 0;
        while (n != 0) {
            count++;
            n &= n - 1; // zero out the least significant set bit
        }
        return count;
    }
}
```
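As a quick sanity check, `countBits(5)` should return `[0, 1, 1, 2, 1, 2]`. The expected values can be cross-checked against the JDK's built-in `Integer.bitCount` (this standalone sketch uses `bitCount` rather than the `Solution` class above, so it compiles on its own):

```java
import java.util.Arrays;

public class CountBitsCheck {
    public static void main(String[] args) {
        int n = 5;
        int[] expected = new int[n + 1];
        for (int i = 0; i <= n; i++) {
            expected[i] = Integer.bitCount(i); // JDK reference popcount
        }
        System.out.println(Arrays.toString(expected)); // prints "[0, 1, 1, 2, 1, 2]"
    }
}
```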