Calculate ordered arrangements with repetition (n^r). Supports variable pool sizes per position, position breakdown, entropy, and reference tables.
Permutations with repetition count ordered arrangements when each position can reuse the available options. For r positions with n choices each, the total is n^r. If the positions have different pool sizes, multiply them together: n₁ × n₂ × ... × nᵣ.
This calculator handles both the equal-pool case and mixed formats such as license plates, product codes, or any system where different positions draw from different character sets. It reports the total count, probability of one exact outcome, entropy in bits, and a position-by-position breakdown.
Typical use cases include PINs, passwords, dice rolls, serial-number formats, and any other ordered slot-filling problem where repetition is allowed.
The simple n^r formula covers many examples, but real counting problems often mix letters, digits, symbols, or category-specific slots. Breaking the total into per-position factors makes it easier to verify a format and explain where the size of the search space comes from.
Simple case (same pool): Total = n^r
General case (different pools): Total = n₁ × n₂ × ... × nᵣ (the multiplication principle, or product rule)
Entropy = log₂(Total) bits
Examples:
4-digit PIN: 10^4 = 10,000
License AAA-1234: 26³ × 10⁴ = 175,760,000
3d6 dice: 6³ = 216
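The formulas above can be sketched in a few lines of Python; `arrangement_stats` is a hypothetical helper name, not part of any particular library:

```python
import math

def arrangement_stats(pool_sizes):
    """Total ordered arrangements for positions with the given pool sizes,
    plus the probability of one exact outcome and the entropy in bits."""
    total = math.prod(pool_sizes)
    return total, 1 / total, math.log2(total)

# Equal pools: a 4-digit PIN (10 choices per position)
print(arrangement_stats([10] * 4)[0])             # 10000

# Mixed pools: AAA-1234 license format (3 letters, then 4 digits)
print(arrangement_stats([26] * 3 + [10] * 4)[0])  # 175760000
```

Passing one list of per-position pool sizes covers both the equal-pool case (n^r) and the mixed-format case (n₁ × n₂ × ... × nᵣ) with the same code path.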
Result: 6 × 6 × 6 = 216
Rolling three six-sided dice gives 6³ = 216 possible outcomes. Each outcome (like 3-5-2) is equally likely with probability 1/216 ≈ 0.46%. The order matters: (1,2,3) and (3,2,1) are different outcomes.
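The 3d6 count can be verified by brute-force enumeration; this sketch uses `itertools.product`, which generates exactly the ordered arrangements with repetition described above:

```python
from itertools import product

# All ordered outcomes of rolling three six-sided dice.
outcomes = list(product(range(1, 7), repeat=3))
print(len(outcomes))          # 216 = 6^3

# Order matters: (1, 2, 3) and (3, 2, 1) are distinct outcomes.
print((1, 2, 3) in outcomes)  # True
print((3, 2, 1) in outcomes)  # True
```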
The multiplication principle (product rule) states: if a process has k stages with n₁, n₂, ..., nₖ options respectively, the total outcomes are n₁ × n₂ × ... × nₖ. It's the basis for counting everything from passwords to genetic sequences (4^L for DNA of length L) to machine configurations.
n^r grows exponentially with r, which has profound implications for security (large password spaces), data encoding (bits per symbol), and computational complexity (search spaces). This growth is both a blessing (cryptographic security) and a curse (combinatorial explosion in optimization problems).
In information theory, the entropy of a random variable drawn uniformly from n^r outcomes is r × log₂(n) bits. This is the amount of information needed to specify one particular outcome. It connects counting (combinatorics) to communication (how many bits to transmit) and security (how hard to guess).
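The entropy identity r × log₂(n) = log₂(n^r) can be checked directly; the DNA example below (alphabet of 4, length 20) is an illustrative assumption:

```python
import math

# Entropy of a uniform draw from n^r outcomes is r * log2(n) bits.
n, r = 4, 20                 # e.g. DNA alphabet of 4 bases, sequence length 20
entropy = r * math.log2(n)
print(entropy)               # 40.0 bits, since log2(4) = 2
print(math.log2(n ** r))     # 40.0: same value via log2 of the total count
```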
With repetition: each position can use any item (n^r total). Without repetition: once an item is used, it's removed from the pool (P(n,r) = n!/(n−r)!). A PIN allows repeats (1111 is valid); a race finishing order does not.
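The two counting rules side by side, as a minimal sketch using the standard library's `math.perm` for the no-repetition case:

```python
import math

n, r = 10, 4
with_rep = n ** r              # repetition allowed: 10^4 = 10000 (e.g. PINs)
without_rep = math.perm(n, r)  # no repetition: 10 * 9 * 8 * 7 = 5040
print(with_rep, without_rep)
```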
10^4 = 10,000 PINs (0000 through 9999). Each digit has 10 choices, and digits can repeat. Security relies on lockout policies, not combinatorial difficulty.
Because you multiply the number of choices at each step. If you choose a shirt (5 options), pants (3 options), and shoes (4 options), total outfits = 5 × 3 × 4 = 60.
A base-n number with r digits can represent exactly n^r values. Binary (n=2): 8 bits = 2^8 = 256 values. Decimal (n=10): 3 digits = 10^3 = 1,000 values. The counting is identical.
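A quick check of the base-n correspondence, using binary as the example:

```python
# A base-n number with r digits can represent exactly n^r values.
n, r = 2, 8
print(n ** r)           # 256 values for an 8-bit byte

# The largest r-digit base-n numeral is n^r - 1:
print(int("1" * r, n))  # 255 = 2^8 - 1
```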
License plates, serial numbers, phone numbers, product codes, and mixed-format IDs often give different positions different character sets. In those cases, multiply the allowed options for each position instead of using one shared n value.
Exponentially in r. 2^10 = 1,024 but 2^20 = 1,048,576 and 2^30 = 1,073,741,824. Each additional position multiplies by n. This exponential growth makes brute-force attacks infeasible for large r.
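The growth figures above can be reproduced in a short loop:

```python
# Each additional position multiplies the space by n; growth in r is exponential.
for r in (10, 20, 30):
    print(r, 2 ** r)
# 10 1024
# 20 1048576
# 30 1073741824
```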